Apr 21 07:08:34.380396 ip-10-0-139-104 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 07:08:34.380408 ip-10-0-139-104 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 07:08:34.380416 ip-10-0-139-104 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 07:08:34.380664 ip-10-0-139-104 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 07:08:45.848263 ip-10-0-139-104 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 07:08:45.848290 ip-10-0-139-104 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b304a3ce5a6e4fc09353bf77f75682f1 --
Apr 21 07:11:04.355572 ip-10-0-139-104 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:11:04.743001 ip-10-0-139-104 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:11:04.743001 ip-10-0-139-104 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:11:04.743001 ip-10-0-139-104 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:11:04.743001 ip-10-0-139-104 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:11:04.743001 ip-10-0-139-104 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:11:04.744423 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.744333    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:11:04.748831 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748811    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:11:04.748831 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748830    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:11:04.748831 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748834    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748838    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748841    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748845    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748848    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748857    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748860    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748863    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748866    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748869    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748872    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748875    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748877    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748880    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748883    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748885    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748888    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748891    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748893    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748896    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:11:04.748929 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748899    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748901    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748904    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748906    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748910    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748913    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748916    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748919    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748922    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748924    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748927    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748930    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748934    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748936    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748939    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748944    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748948    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748951    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748954    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:11:04.749422 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748957    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748960    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748963    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748966    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748970    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748973    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748975    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748978    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748980    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748983    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748985    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748988    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748990    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748993    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748996    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.748999    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749001    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749004    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749008    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:11:04.749899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749010    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749013    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749015    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749018    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749020    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749023    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749026    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749029    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749031    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749034    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749037    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749039    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749042    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749045    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749048    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749050    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749052    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749055    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749057    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749060    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:11:04.750377 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749063    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749066    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749068    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749071    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749074    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.749076    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750047    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750054    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750058    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750061    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750064    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750066    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750069    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750072    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750074    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750077    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750080    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750083    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750086    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750089    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:11:04.750904 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750091    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750094    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750097    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750099    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750102    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750106    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750108    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750111    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750113    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750116    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750120    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750124    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750126    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750129    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750132    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750134    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750137    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750139    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750142    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750144    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:11:04.751408 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750147    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750149    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750152    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750156    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750159    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750161    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750164    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750167    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750169    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750172    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750175    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750177    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750180    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750182    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750185    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750187    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750190    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750192    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750195    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750198    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:11:04.751901 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750200    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750203    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750206    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750208    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750211    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750213    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750215    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750218    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750221    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750224    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750226    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750228    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750231    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750233    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750236    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750238    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750242    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750244    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750246    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:11:04.752421 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750250    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750264    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750270    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750272    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750275    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750278    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750280    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750283    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750285    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750288    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750290    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750293    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750296    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750370    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750378    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750385    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750390    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750395    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750398    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750403    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750407    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:11:04.752899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750411    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750414    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750417    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750421    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750424    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750427    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750430    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750433    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750436    2576 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750439    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750442    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750447    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750450    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750455    2576 flags.go:64] FLAG: --config-dir=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750458    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750461    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750465    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750468    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750471    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750475    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750478    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750481    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750485    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750488    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750491    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:11:04.753433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750496    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750499    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750502    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750505    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750508    2576 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750511    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750516    2576 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750519    2576 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750522    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750525    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750528    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750532    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750535    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750538    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750541    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750544    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750546    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750550    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750553    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750556    2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750560    2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750563    2576 flags.go:64] FLAG: --feature-gates=""
Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750572
2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750575 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750578 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 07:11:04.754053 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750582 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750585 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750588 2576 flags.go:64] FLAG: --help="false" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750591 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-139-104.ec2.internal" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750594 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750597 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750600 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750604 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750608 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750611 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750614 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:11:04.754685 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750617 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750620 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750622 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750625 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750628 2576 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750631 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750634 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750637 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750640 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750642 2576 flags.go:64] FLAG: --lock-file="" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750645 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750648 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750651 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:11:04.754685 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750656 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750659 2576 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750663 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750668 2576 flags.go:64] FLAG: --logging-format="text" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750671 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750674 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750677 2576 flags.go:64] FLAG: --manifest-url="" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750680 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750685 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750688 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750692 2576 flags.go:64] FLAG: --max-pods="110" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750695 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750698 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750701 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750704 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750707 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:11:04.755305 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750711 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750714 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750722 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750725 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750728 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750731 2576 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750734 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:11:04.755305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750741 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750747 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750750 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750753 2576 flags.go:64] FLAG: --port="10250" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750756 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750759 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03fddf5202c1ad7a3" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750762 2576 flags.go:64] FLAG: --qos-reserved="" Apr 
21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750765 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750768 2576 flags.go:64] FLAG: --register-node="true" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750771 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750774 2576 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750778 2576 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750781 2576 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750785 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750788 2576 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750792 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750795 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750797 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750800 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750803 2576 flags.go:64] FLAG: --runonce="false" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750806 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750809 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750812 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750815 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750818 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750821 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:11:04.755899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750825 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750828 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750830 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750833 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750836 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750839 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750842 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750848 2576 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750851 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750856 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:11:04.756562 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:11:04.750859 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750862 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750866 2576 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750869 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750872 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750875 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750878 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750881 2576 flags.go:64] FLAG: --v="2" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750886 2576 flags.go:64] FLAG: --version="false" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750891 2576 flags.go:64] FLAG: --vmodule="" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750896 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.750899 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750993 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750996 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:11:04.756562 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.750999 2576 feature_gate.go:328] 
unrecognized feature gate: ShortCertRotation Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751002 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751005 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751009 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751012 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751015 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751018 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751021 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751023 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751026 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751029 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751032 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751034 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:11:04.757192 
ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751037 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751039 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751043 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751046 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751048 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751051 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:11:04.757192 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751053 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751056 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751058 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751061 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751063 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751066 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751068 2576 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751071 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751075 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751078 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751081 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751083 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751086 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751089 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751091 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751094 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751097 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751099 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751102 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751104 2576 feature_gate.go:328] 
unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:11:04.757840 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751109 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751112 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751115 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751119 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751121 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751124 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751127 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751129 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751133 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751135 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751138 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751141 2576 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751144 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751146 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751149 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751151 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751154 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751157 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751159 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:11:04.758687 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751161 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751166 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751168 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751171 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751173 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 
07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751176 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751179 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751181 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751184 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751186 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751189 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751191 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751194 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751197 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751199 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751202 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751204 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751207 2576 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751210 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751212 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:11:04.759313 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751215 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751218 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751221 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751223 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751225 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.751228 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:11:04.759845 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.751838 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.759911 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.759930 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.759983 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.759988 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.759991 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.759995 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.759999 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760002 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760005 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:11:04.760005 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760009 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760012 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760015 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760018 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760021 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760024 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760027 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760030 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760033 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760036 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760038 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760041 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760043 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760047 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760051 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760054 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760057 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760060 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760062 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:11:04.760275 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760065 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760067 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760070 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760072 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760075 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760077 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760080 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760083 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760086 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760088 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760091 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760093 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760096 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760099 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760101 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760105 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760108 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760112 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760115 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760118 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:11:04.760750 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760121 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760124 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760127 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760129 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760134 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760138 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760141 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760144 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760147 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760150 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760153 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760156 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760158 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760161 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760164 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760167 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760169 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760172 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760175 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760177 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:11:04.761234 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760180 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760183 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760185 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760188 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760190 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760193 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760196 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760199 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760202 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760205 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760208 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760211 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760213 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760216 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760218 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760221 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760224 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760226 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760229 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:11:04.761836 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760231 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.760237 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760351 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760357 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760360 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760363 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760366 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760368 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760371 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760374 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760376 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760379 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760382 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760384 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760387 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760389 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:11:04.762376 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760392 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760394 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760397 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760399 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760402 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760406 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760408 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760412 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760415 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760418 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760421 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760423 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760425 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760428 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760431 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760435 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760438 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760441 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760444 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:11:04.762775 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760446 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760449 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760452 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760454 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760457 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760459 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760462 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760464 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760467 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760469 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760471 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760474 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760477 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760479 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760481 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760484 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760486 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760490 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760493 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760496 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:11:04.763240 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760499 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760501 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760505 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760507 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760510 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760512 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760515 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760517 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760520 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760522 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760525 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760527 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760530 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760532 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760535 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760537 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760540 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760542 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760545 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760548 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:11:04.763747 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760550 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760553 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760555 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760558 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760560 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760562 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760565 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760567 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760570 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760572 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760576 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760580 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:04.760582 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.760588 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:11:04.764233 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.761193 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 07:11:04.764611 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.764595 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 07:11:04.765605 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.765593 2576 server.go:1019] "Starting client certificate rotation"
Apr 21 07:11:04.765716 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.765697 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:11:04.765757 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.765745 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:11:04.787198 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.787175 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:11:04.789942 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.789913 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:11:04.806706 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.806683 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 21 07:11:04.812048 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.812031 2576 log.go:25] "Validated CRI v1 image API"
Apr 21 07:11:04.813184 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.813167 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 07:11:04.819970 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.819944 2576 fs.go:135] Filesystem UUIDs: map[56db7b68-b77c-47ed-bf64-5df0b3444a86:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 81b86db3-5ab3-4ee8-8d92-134e4bb359ab:/dev/nvme0n1p3]
Apr 21 07:11:04.820030 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.819970 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 07:11:04.822331 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.822309 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:11:04.827780 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.827669 2576 manager.go:217] Machine: {Timestamp:2026-04-21 07:11:04.824947821 +0000 UTC m=+0.362834755 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107841 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a1fed0076c90ebd3dabfb6cfd14ad SystemUUID:ec2a1fed-0076-c90e-bd3d-abfb6cfd14ad BootID:b304a3ce-5a6e-4fc0-9353-bf77f75682f1 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7e:c3:25:74:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7e:c3:25:74:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:13:60:b6:d6:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 07:11:04.827780 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.827773 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 07:11:04.827904 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.827864 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 07:11:04.828864 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.828840 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 07:11:04.829000 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.828867 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-104.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 07:11:04.829045 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.829010 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 07:11:04.829045 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.829018 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 07:11:04.829045 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.829031 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:11:04.829718 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.829708 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:11:04.830818 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.830807 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:11:04.830931 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.830922 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:11:04.833031 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.833021 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:11:04.833070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.833035 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:11:04.833070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.833047 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:11:04.833070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.833055 2576 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:11:04.833070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.833065 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 07:11:04.834040 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.834026 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:11:04.834116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.834043 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:11:04.837154 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.837137 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:11:04.838624 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.838610 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:11:04.839820 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839801 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:11:04.839820 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839821 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839829 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839838 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839848 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839856 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839862 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839867 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839875 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839881 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839891 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:11:04.839935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.839900 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:11:04.841525 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.841513 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:11:04.841525 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.841524 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:11:04.845200 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.845170 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-104.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:11:04.845290 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845271 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-104.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:11:04.845290 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845282 2576 
watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:11:04.845394 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845316 2576 server.go:1295] "Started kubelet" Apr 21 07:11:04.845438 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.845385 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:11:04.845539 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845475 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:11:04.845573 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845513 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:11:04.845604 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.845590 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:11:04.846288 ip-10-0-139-104 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 07:11:04.847007 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.846993 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:11:04.847512 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.847495 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:11:04.848593 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.848575 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nm5fs" Apr 21 07:11:04.851675 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.850743 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-104.ec2.internal.18a84dae7ed673c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-104.ec2.internal,UID:ip-10-0-139-104.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-104.ec2.internal,},FirstTimestamp:2026-04-21 07:11:04.845292486 +0000 UTC m=+0.383179401,LastTimestamp:2026-04-21 07:11:04.845292486 +0000 UTC m=+0.383179401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-104.ec2.internal,}" Apr 21 07:11:04.852690 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.852673 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:11:04.853192 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853172 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:11:04.853713 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853695 2576 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:11:04.853713 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853716 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:11:04.853870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853820 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:11:04.853870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853856 2576 factory.go:55] Registering systemd factory Apr 21 07:11:04.853968 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853908 2576 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:11:04.853968 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853863 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:11:04.853968 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.853950 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:11:04.854149 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.854118 2576 factory.go:153] Registering CRI-O factory Apr 21 07:11:04.854149 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.854140 2576 factory.go:223] Registration of the crio container factory successfully Apr 21 07:11:04.854286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.854202 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:11:04.854286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.854226 2576 factory.go:103] Registering Raw factory Apr 21 07:11:04.854286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.854243 2576 manager.go:1196] Started watching for new ooms in manager Apr 21 07:11:04.854429 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.854369 2576 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 07:11:04.855450 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.855429 2576 manager.go:319] Starting recovery of all containers Apr 21 07:11:04.856021 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.855999 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 07:11:04.857406 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.857381 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:11:04.857675 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.857613 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-104.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 07:11:04.857802 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.857782 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nm5fs" Apr 21 07:11:04.866898 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.866749 2576 manager.go:324] Recovery completed Apr 21 07:11:04.871220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.871206 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:04.873565 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.873548 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:04.873623 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.873578 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:04.873623 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.873589 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:04.874111 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.874096 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:11:04.874111 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.874110 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:11:04.874237 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.874128 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:11:04.876390 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.876376 2576 policy_none.go:49] "None policy: Start" Apr 21 07:11:04.876445 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.876395 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:11:04.876445 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.876411 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.912503 2576 manager.go:341] "Starting Device Plugin manager" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.912546 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.912559 2576 server.go:85] "Starting device plugin registration server" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.912805 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.912818 2576 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.912935 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.913047 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.913059 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.913567 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:11:04.921017 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.913608 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 07:11:04.984140 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.984107 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:11:04.985314 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.985294 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:11:04.985379 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.985326 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:11:04.985379 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.985349 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 07:11:04.985379 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.985357 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:11:04.985486 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:04.985403 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:11:04.990804 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:04.990788 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:11:05.013070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.013014 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:05.014330 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.014317 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:05.014431 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.014344 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:05.014431 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.014356 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:05.014431 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.014379 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.024336 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.024312 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.024336 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.024336 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-104.ec2.internal\": node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 
07:11:05.047274 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.047233 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 07:11:05.086533 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.086481 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal"] Apr 21 07:11:05.086689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.086588 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:05.089988 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.089968 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:05.090098 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.090002 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:05.090098 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.090019 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:05.091323 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.091308 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:05.091407 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.091386 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.091455 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.091420 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:05.092577 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092562 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:05.092645 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092592 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:05.092645 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092568 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:05.092645 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092607 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:05.092645 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092620 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:05.092645 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.092633 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:05.093874 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.093857 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.093946 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.093881 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:11:05.094584 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.094570 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:11:05.094666 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.094606 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:11:05.094666 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.094643 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:11:05.119105 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.119075 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-104.ec2.internal\" not found" node="ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.123604 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.123585 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-104.ec2.internal\" not found" node="ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.147933 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.147905 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 07:11:05.156347 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.156321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.156417 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.156350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c93887b2602d2e26b42ce7ba4f7f773-config\") pod \"kube-apiserver-proxy-ip-10-0-139-104.ec2.internal\" (UID: \"3c93887b2602d2e26b42ce7ba4f7f773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.156417 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.156374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.248849 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.248816 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found" Apr 21 07:11:05.257344 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c93887b2602d2e26b42ce7ba4f7f773-config\") pod \"kube-apiserver-proxy-ip-10-0-139-104.ec2.internal\" (UID: \"3c93887b2602d2e26b42ce7ba4f7f773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" Apr 21 07:11:05.257422 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.257422 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.257488 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c93887b2602d2e26b42ce7ba4f7f773-config\") pod \"kube-apiserver-proxy-ip-10-0-139-104.ec2.internal\" (UID: \"3c93887b2602d2e26b42ce7ba4f7f773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.257529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.257565 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.257527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5ef588ca821884d2680061f64d3ed09f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal\" (UID: \"5ef588ca821884d2680061f64d3ed09f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.349204 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.349123 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.420658 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.420617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.425001 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.424984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:05.449571 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.449538 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.550159 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.550129 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.650664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.650572 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.751319 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.751283 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.765778 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.765749 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 07:11:05.765916 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.765900 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 07:11:05.851973 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.851935 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.853027 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.853010 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 07:11:05.860003 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.859972 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 07:06:04 +0000 UTC" deadline="2027-12-20 00:20:51.717477141 +0000 UTC"
Apr 21 07:11:05.860003 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.859999 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14585h9m45.857481166s"
Apr 21 07:11:05.867927 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.867896 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:11:05.913629 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.913559 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6lt7v"
Apr 21 07:11:05.925232 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.925204 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6lt7v"
Apr 21 07:11:05.948024 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:05.947963 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef588ca821884d2680061f64d3ed09f.slice/crio-bbf4c6f51338afd29309689d2eb160ee0b6e72a5f4b2a037c2e6e8ebc6e41516 WatchSource:0}: Error finding container bbf4c6f51338afd29309689d2eb160ee0b6e72a5f4b2a037c2e6e8ebc6e41516: Status 404 returned error can't find the container with id bbf4c6f51338afd29309689d2eb160ee0b6e72a5f4b2a037c2e6e8ebc6e41516
Apr 21 07:11:05.952115 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:05.952095 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-104.ec2.internal\" not found"
Apr 21 07:11:05.953247 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.953230 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 07:11:05.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.973082 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:11:05.989287 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.989224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" event={"ID":"5ef588ca821884d2680061f64d3ed09f","Type":"ContainerStarted","Data":"bbf4c6f51338afd29309689d2eb160ee0b6e72a5f4b2a037c2e6e8ebc6e41516"}
Apr 21 07:11:05.989741 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.989719 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:11:05.990392 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:05.990371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" event={"ID":"3c93887b2602d2e26b42ce7ba4f7f773","Type":"ContainerStarted","Data":"50e38ff098dbdd581ffc30d9d440a7bcd3de716ad27b63d6f2b1bf8cdc631476"}
Apr 21 07:11:06.053678 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.053649 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:06.066312 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.066286 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:11:06.068155 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.068134 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal"
Apr 21 07:11:06.078520 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.078492 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:11:06.117613 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.117574 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:11:06.759488 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.759457 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:11:06.834153 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.834118 2576 apiserver.go:52] "Watching apiserver"
Apr 21 07:11:06.843739 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.843710 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 07:11:06.844780 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.844757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-f8dzv","openshift-cluster-node-tuning-operator/tuned-8w45p","openshift-dns/node-resolver-nr6p6","openshift-image-registry/node-ca-v9k64","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal","openshift-multus/multus-additional-cni-plugins-vzbdr","openshift-multus/multus-r8m6x","openshift-multus/network-metrics-daemon-fqxn4","kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss","openshift-network-diagnostics/network-check-target-zv2g2","openshift-network-operator/iptables-alerter-lccf5","openshift-ovn-kubernetes/ovnkube-node-hmsxs"]
Apr 21 07:11:06.849205 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.849178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.851280 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.851241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.852091 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.852032 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.852091 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.852056 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 07:11:06.852091 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.852069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.852315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.852094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-z6d2x\""
Apr 21 07:11:06.852433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.852400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 07:11:06.853543 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.853482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nr6p6"
Apr 21 07:11:06.853691 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.853571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v9k64"
Apr 21 07:11:06.854419 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.854118 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vhdzg\""
Apr 21 07:11:06.854419 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.854217 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.854419 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.854216 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.855777 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.855760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.856683 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.856663 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.857595 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.857568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qpdtf\""
Apr 21 07:11:06.858024 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.858007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f8dzv"
Apr 21 07:11:06.859146 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.859121 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 07:11:06.859318 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.859292 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 07:11:06.859717 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.859697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.859806 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.859789 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 07:11:06.860024 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.860002 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7v8lt\""
Apr 21 07:11:06.860508 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.860410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.860657 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.860640 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.860887 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.860869 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5zf2r\""
Apr 21 07:11:06.863743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.863724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:06.863842 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.863810 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss"
Apr 21 07:11:06.863842 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:06.863808 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:06.864975 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.864953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-conf\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.864988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-run\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-tuned\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5aed744-8c66-4da8-b412-288b462f285b-tmp-dir\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6"
Apr 21 07:11:06.865068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-serviceca\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64"
Apr 21 07:11:06.865068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-systemd\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-host\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpj6r\" (UniqueName: \"kubernetes.io/projected/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-kube-api-access-kpj6r\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-system-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-cnibin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-os-release\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-netns\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-hostroot\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-sys\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865372 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-system-cni-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-os-release\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-cni-binary-copy\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-bin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-conf-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865554 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-multus-certs\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-lib-modules\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-host\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cml49\" (UniqueName: \"kubernetes.io/projected/e12015e6-2082-4f37-be78-ba178fd7beec-kube-api-access-cml49\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-k8s-cni-cncf-io\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865692 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gwksx\""
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j7b6\" (UniqueName: \"kubernetes.io/projected/dec13d58-9abe-4cbd-a479-45ceea3970a9-kube-api-access-4j7b6\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjkw\" (UniqueName: \"kubernetes.io/projected/13d3ab13-d431-4545-bc03-50c6840b6f39-kube-api-access-pzjkw\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.865786 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-cnibin\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-kubernetes\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5aed744-8c66-4da8-b412-288b462f285b-hosts-file\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-multus\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-etc-kubernetes\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-modprobe-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.865997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-tmp\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-socket-dir-parent\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-kubelet\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-daemon-config\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysconfig\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-var-lib-kubelet\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:06.866529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqtzx\" (UniqueName: \"kubernetes.io/projected/c5aed744-8c66-4da8-b412-288b462f285b-kube-api-access-qqtzx\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6"
Apr 21 07:11:06.867371 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:06.866310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:06.867371 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr"
Apr 21 07:11:06.867371 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p"
Apr 21 07:11:06.867371 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 07:11:06.867371 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.866864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.867617 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.867531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gwqpq\""
Apr 21 07:11:06.867617 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.867576 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.868573 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.868551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lccf5"
Apr 21 07:11:06.871215 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.871186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs"
Apr 21 07:11:06.871867 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.871457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:11:06.871867 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.871559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sw4cm\""
Apr 21 07:11:06.871867 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.871735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 07:11:06.872079 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.871873 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 07:11:06.874233 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.874211 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 07:11:06.874383 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.874211 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 07:11:06.875074 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.875049 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 07:11:06.875178 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.875059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 07:11:06.875178 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.875139 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 07:11:06.875178 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.875156 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 07:11:06.875615 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.875584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4l8sd\"" Apr 21 07:11:06.926247 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.926197 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:06:05 +0000 UTC" deadline="2027-10-19 22:13:34.741635713 +0000 UTC" Apr 21 07:11:06.926247 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.926240 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13119h2m27.815399149s" Apr 21 07:11:06.955318 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.955286 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 07:11:06.967120 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysconfig\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.967299 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqtzx\" (UniqueName: \"kubernetes.io/projected/c5aed744-8c66-4da8-b412-288b462f285b-kube-api-access-qqtzx\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " 
pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.967299 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.967299 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysconfig\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.967299 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-kubelet\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-systemd-units\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t95bn\" (UniqueName: 
\"kubernetes.io/projected/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-kube-api-access-t95bn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-run\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5aed744-8c66-4da8-b412-288b462f285b-tmp-dir\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.967489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-serviceca\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.967745 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-systemd\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.967745 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-ovn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.967745 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.967583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.968084 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-os-release\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968163 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-netns\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968163 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-os-release\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968276 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-hostroot\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968276 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5aed744-8c66-4da8-b412-288b462f285b-tmp-dir\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.968276 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-socket-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.968276 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-run\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.968447 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-netd\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.968447 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.968447 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-config\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.968447 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-netns\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968447 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-sys\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.968659 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968453 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-serviceca\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.968659 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-hostroot\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968659 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-sys\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.968788 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-cni-binary-copy\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.968788 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-etc-selinux\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.968877 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.968786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-lib-modules\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.969306 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-k8s-cni-cncf-io\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.969398 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjkw\" (UniqueName: \"kubernetes.io/projected/13d3ab13-d431-4545-bc03-50c6840b6f39-kube-api-access-pzjkw\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.969449 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7def839-f2b6-4ceb-9338-57bbb74327a3-host-slash\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:06.969494 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-slash\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.969539 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969530 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-netns\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.969603 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovn-node-metrics-cert\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.969656 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-lib-modules\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.969656 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-kubernetes\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.969739 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-k8s-cni-cncf-io\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.969739 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:11:06.969713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5aed744-8c66-4da8-b412-288b462f285b-hosts-file\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.970015 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.969991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-kubernetes\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.970088 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.970088 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-etc-kubernetes\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970177 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5aed744-8c66-4da8-b412-288b462f285b-hosts-file\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.970244 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970204 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-log-socket\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.970308 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-etc-kubernetes\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970363 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-tmp\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.970363 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-cni-binary-copy\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-socket-dir-parent\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970389 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.970529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-kubelet\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970578 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-socket-dir-parent\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970623 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-daemon-config\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970669 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-kubelet\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.970716 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970667 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:06.970716 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970690 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 07:11:06.970826 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-var-lib-kubelet\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.970874 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.970941 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/554c2ac8-64c6-4da1-80ab-4059aee84a3e-agent-certs\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:06.970990 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.970962 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpcf\" (UniqueName: \"kubernetes.io/projected/61710589-be37-470a-8046-39c730b38313-kube-api-access-4wpcf\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:06.971041 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:06.971086 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.971220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-daemon-config\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.971288 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-env-overrides\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.971343 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-var-lib-kubelet\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.971343 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-conf\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.971343 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-tuned\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.971469 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-registration-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.971469 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-device-dir\") pod 
\"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.971552 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7def839-f2b6-4ceb-9338-57bbb74327a3-iptables-alerter-script\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:06.971599 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-etc-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.971643 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-systemd\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.971692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-host\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.971692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpj6r\" (UniqueName: 
\"kubernetes.io/projected/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-kube-api-access-kpj6r\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.971778 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-system-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.971824 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-cnibin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.971824 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-systemd\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.972029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-cnibin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.971998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-host\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.972237 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-system-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972237 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-sysctl-conf\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.972344 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-multus-certs\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972344 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqtl\" (UniqueName: 
\"kubernetes.io/projected/b7def839-f2b6-4ceb-9338-57bbb74327a3-kube-api-access-sxqtl\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:06.972429 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-system-cni-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972429 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-os-release\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972519 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-run-multus-certs\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972519 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-system-cni-dir\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972613 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972535 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972613 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-bin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972704 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-conf-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972704 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-bin\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.972704 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-host\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972705 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-bin\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cml49\" (UniqueName: \"kubernetes.io/projected/e12015e6-2082-4f37-be78-ba178fd7beec-kube-api-access-cml49\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-conf-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-host\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j7b6\" (UniqueName: \"kubernetes.io/projected/dec13d58-9abe-4cbd-a479-45ceea3970a9-kube-api-access-4j7b6\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.972841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972831 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-os-release\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-cnibin\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.972987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-node-log\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-multus\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " 
pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.973112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/554c2ac8-64c6-4da1-80ab-4059aee84a3e-konnectivity-ca\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-modprobe-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-host-var-lib-cni-multus\") pod \"multus-r8m6x\" (UID: 
\"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-modprobe-d\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dec13d58-9abe-4cbd-a479-45ceea3970a9-multus-cni-dir\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.973364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e12015e6-2082-4f37-be78-ba178fd7beec-cnibin\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973707 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e12015e6-2082-4f37-be78-ba178fd7beec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:06.973780 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-sys-fs\") pod 
\"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.973925 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.973806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zsm\" (UniqueName: \"kubernetes.io/projected/fad82241-109b-4dde-923e-45ecd4be2d96-kube-api-access-w9zsm\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:06.974385 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.974367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-var-lib-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.974537 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.974519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.974663 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.974634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-script-lib\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:06.975164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.975145 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-etc-tuned\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.975567 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.975548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/13d3ab13-d431-4545-bc03-50c6840b6f39-tmp\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.977511 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.977490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqtzx\" (UniqueName: \"kubernetes.io/projected/c5aed744-8c66-4da8-b412-288b462f285b-kube-api-access-qqtzx\") pod \"node-resolver-nr6p6\" (UID: \"c5aed744-8c66-4da8-b412-288b462f285b\") " pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:06.978959 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.978938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjkw\" (UniqueName: \"kubernetes.io/projected/13d3ab13-d431-4545-bc03-50c6840b6f39-kube-api-access-pzjkw\") pod \"tuned-8w45p\" (UID: \"13d3ab13-d431-4545-bc03-50c6840b6f39\") " pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:06.987971 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.987943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpj6r\" (UniqueName: \"kubernetes.io/projected/4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66-kube-api-access-kpj6r\") pod \"node-ca-v9k64\" (UID: \"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66\") " pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:06.988075 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.987993 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4j7b6\" (UniqueName: \"kubernetes.io/projected/dec13d58-9abe-4cbd-a479-45ceea3970a9-kube-api-access-4j7b6\") pod \"multus-r8m6x\" (UID: \"dec13d58-9abe-4cbd-a479-45ceea3970a9\") " pod="openshift-multus/multus-r8m6x" Apr 21 07:11:06.988401 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:06.988379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cml49\" (UniqueName: \"kubernetes.io/projected/e12015e6-2082-4f37-be78-ba178fd7beec-kube-api-access-cml49\") pod \"multus-additional-cni-plugins-vzbdr\" (UID: \"e12015e6-2082-4f37-be78-ba178fd7beec\") " pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:07.075466 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7def839-f2b6-4ceb-9338-57bbb74327a3-host-slash\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.075466 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-slash\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075466 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-netns\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075508 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-slash\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7def839-f2b6-4ceb-9338-57bbb74327a3-host-slash\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovn-node-metrics-cert\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-netns\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-log-socket\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075599 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/554c2ac8-64c6-4da1-80ab-4059aee84a3e-agent-certs\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-log-socket\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpcf\" (UniqueName: \"kubernetes.io/projected/61710589-be37-470a-8046-39c730b38313-kube-api-access-4wpcf\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:07.075689 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.075678 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.075740 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs 
podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:07.575715236 +0000 UTC m=+3.113602154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-env-overrides\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-openvswitch\") pod 
\"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-registration-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-registration-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-device-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7def839-f2b6-4ceb-9338-57bbb74327a3-iptables-alerter-script\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-device-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-etc-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.075991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqtl\" (UniqueName: \"kubernetes.io/projected/b7def839-f2b6-4ceb-9338-57bbb74327a3-kube-api-access-sxqtl\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-bin\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-node-log\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076093 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-node-log\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/554c2ac8-64c6-4da1-80ab-4059aee84a3e-konnectivity-ca\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-sys-fs\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9zsm\" (UniqueName: \"kubernetes.io/projected/fad82241-109b-4dde-923e-45ecd4be2d96-kube-api-access-w9zsm\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-var-lib-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:11:07.076282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-env-overrides\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-script-lib\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-sys-fs\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-kubelet\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-systemd-units\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-bin\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t95bn\" (UniqueName: \"kubernetes.io/projected/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-kube-api-access-t95bn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7def839-f2b6-4ceb-9338-57bbb74327a3-iptables-alerter-script\") pod \"iptables-alerter-lccf5\" (UID: 
\"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-etc-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-systemd\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.076670 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-ovn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-systemd\") pod \"ovnkube-node-hmsxs\" (UID: 
\"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-var-lib-openvswitch\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-systemd-units\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-run-ovn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-kubelet\") pod \"ovnkube-node-hmsxs\" (UID: 
\"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-socket-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-netd\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-config\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-etc-selinux\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-cni-netd\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.076974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-socket-dir\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.077050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/554c2ac8-64c6-4da1-80ab-4059aee84a3e-konnectivity-ca\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:07.077315 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.077091 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-script-lib\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.077824 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.077110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fad82241-109b-4dde-923e-45ecd4be2d96-etc-selinux\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: \"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.077824 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.077457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovnkube-config\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.078416 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.078391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/554c2ac8-64c6-4da1-80ab-4059aee84a3e-agent-certs\") pod \"konnectivity-agent-f8dzv\" (UID: \"554c2ac8-64c6-4da1-80ab-4059aee84a3e\") " pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:07.078680 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.078657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-ovn-node-metrics-cert\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.086689 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.086668 2576 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:07.086689 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.086687 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:07.086854 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.086698 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:07.086854 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.086788 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:07.586774478 +0000 UTC m=+3.124661380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:07.090841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.090812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t95bn\" (UniqueName: \"kubernetes.io/projected/e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a-kube-api-access-t95bn\") pod \"ovnkube-node-hmsxs\" (UID: \"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.091236 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.091212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpcf\" (UniqueName: \"kubernetes.io/projected/61710589-be37-470a-8046-39c730b38313-kube-api-access-4wpcf\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:07.091534 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.091513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqtl\" (UniqueName: \"kubernetes.io/projected/b7def839-f2b6-4ceb-9338-57bbb74327a3-kube-api-access-sxqtl\") pod \"iptables-alerter-lccf5\" (UID: \"b7def839-f2b6-4ceb-9338-57bbb74327a3\") " pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.091625 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.091609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9zsm\" (UniqueName: \"kubernetes.io/projected/fad82241-109b-4dde-923e-45ecd4be2d96-kube-api-access-w9zsm\") pod \"aws-ebs-csi-driver-node-ln5ss\" (UID: 
\"fad82241-109b-4dde-923e-45ecd4be2d96\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.162102 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.162059 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r8m6x" Apr 21 07:11:07.168005 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.167965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8w45p" Apr 21 07:11:07.177660 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.177635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nr6p6" Apr 21 07:11:07.185336 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.185311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v9k64" Apr 21 07:11:07.191946 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.191922 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" Apr 21 07:11:07.200606 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.200578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:07.207338 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.207311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" Apr 21 07:11:07.216023 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.215996 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lccf5" Apr 21 07:11:07.221734 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.221712 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:07.580152 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.580119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:07.580360 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.580319 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:07.580430 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.580392 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:08.580372747 +0000 UTC m=+4.118259648 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:07.654671 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.654629 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad82241_109b_4dde_923e_45ecd4be2d96.slice/crio-03de4c3172a38d0d576da7107d41fab7566c9e3bc32e51eaf5402f3a6060069b WatchSource:0}: Error finding container 03de4c3172a38d0d576da7107d41fab7566c9e3bc32e51eaf5402f3a6060069b: Status 404 returned error can't find the container with id 03de4c3172a38d0d576da7107d41fab7566c9e3bc32e51eaf5402f3a6060069b Apr 21 07:11:07.656145 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.656122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7def839_f2b6_4ceb_9338_57bbb74327a3.slice/crio-9733044367ed3164a59bf33aba0b8bf0e4ecba32548ae48b93dce2390650ac37 WatchSource:0}: Error finding container 9733044367ed3164a59bf33aba0b8bf0e4ecba32548ae48b93dce2390650ac37: Status 404 returned error can't find the container with id 9733044367ed3164a59bf33aba0b8bf0e4ecba32548ae48b93dce2390650ac37 Apr 21 07:11:07.657505 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.657479 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod554c2ac8_64c6_4da1_80ab_4059aee84a3e.slice/crio-1cb6e2ff0f1b7fc4131ab5f77335f58bb3c9cf0cfb3149a5eecbecd46de03eeb WatchSource:0}: Error finding container 1cb6e2ff0f1b7fc4131ab5f77335f58bb3c9cf0cfb3149a5eecbecd46de03eeb: Status 404 returned error can't find the container with id 1cb6e2ff0f1b7fc4131ab5f77335f58bb3c9cf0cfb3149a5eecbecd46de03eeb Apr 21 07:11:07.660534 
ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.660463 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12015e6_2082_4f37_be78_ba178fd7beec.slice/crio-95238bf61fff597207938b4e69f1bac6b66c77a854539022693efb354ff47267 WatchSource:0}: Error finding container 95238bf61fff597207938b4e69f1bac6b66c77a854539022693efb354ff47267: Status 404 returned error can't find the container with id 95238bf61fff597207938b4e69f1bac6b66c77a854539022693efb354ff47267 Apr 21 07:11:07.661242 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.661216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5aed744_8c66_4da8_b412_288b462f285b.slice/crio-779d1dcfd8665f6dad57bf14faa11bc8e7239c8fa7de40eae00fed4f98dcc019 WatchSource:0}: Error finding container 779d1dcfd8665f6dad57bf14faa11bc8e7239c8fa7de40eae00fed4f98dcc019: Status 404 returned error can't find the container with id 779d1dcfd8665f6dad57bf14faa11bc8e7239c8fa7de40eae00fed4f98dcc019 Apr 21 07:11:07.662351 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.662326 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec13d58_9abe_4cbd_a479_45ceea3970a9.slice/crio-6e2af66d624c9032da7a0ffd7eb7909f509d5e6d946393bb5a92ae873eb2a45f WatchSource:0}: Error finding container 6e2af66d624c9032da7a0ffd7eb7909f509d5e6d946393bb5a92ae873eb2a45f: Status 404 returned error can't find the container with id 6e2af66d624c9032da7a0ffd7eb7909f509d5e6d946393bb5a92ae873eb2a45f Apr 21 07:11:07.663211 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.663189 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b301c7f_1c6e_4bf8_ba19_c3a7fd175d66.slice/crio-5accd1cb87de439939cdef6aede95300655636bbe888c468890934abda3ae792 WatchSource:0}: 
Error finding container 5accd1cb87de439939cdef6aede95300655636bbe888c468890934abda3ae792: Status 404 returned error can't find the container with id 5accd1cb87de439939cdef6aede95300655636bbe888c468890934abda3ae792 Apr 21 07:11:07.664530 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.664502 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d3ab13_d431_4545_bc03_50c6840b6f39.slice/crio-c14e9800595a52f75c9cf80565a113843816e7b94742a9ca384520b7fdd115fd WatchSource:0}: Error finding container c14e9800595a52f75c9cf80565a113843816e7b94742a9ca384520b7fdd115fd: Status 404 returned error can't find the container with id c14e9800595a52f75c9cf80565a113843816e7b94742a9ca384520b7fdd115fd Apr 21 07:11:07.665471 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:07.665272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19ba7b9_b9ea_4177_a4b5_9fd6f16f010a.slice/crio-e5fbbcdbcc5a433b66b929248f4bcf3f214cf0cf93cf32048620e70d5ff764e8 WatchSource:0}: Error finding container e5fbbcdbcc5a433b66b929248f4bcf3f214cf0cf93cf32048620e70d5ff764e8: Status 404 returned error can't find the container with id e5fbbcdbcc5a433b66b929248f4bcf3f214cf0cf93cf32048620e70d5ff764e8 Apr 21 07:11:07.681240 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.681218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:07.681404 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.681347 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:07.681404 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.681360 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:07.681404 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.681369 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:07.681496 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:07.681412 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:08.681397988 +0000 UTC m=+4.219284890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:07.926917 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.926669 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:06:05 +0000 UTC" deadline="2027-09-18 07:36:37.704490667 +0000 UTC" Apr 21 07:11:07.926917 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.926851 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12360h25m29.77764265s" Apr 21 07:11:07.994534 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.994495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v9k64" event={"ID":"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66","Type":"ContainerStarted","Data":"5accd1cb87de439939cdef6aede95300655636bbe888c468890934abda3ae792"} Apr 21 07:11:07.996518 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.996489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerStarted","Data":"95238bf61fff597207938b4e69f1bac6b66c77a854539022693efb354ff47267"} Apr 21 07:11:07.997864 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:07.997839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f8dzv" event={"ID":"554c2ac8-64c6-4da1-80ab-4059aee84a3e","Type":"ContainerStarted","Data":"1cb6e2ff0f1b7fc4131ab5f77335f58bb3c9cf0cfb3149a5eecbecd46de03eeb"} Apr 21 07:11:07.999500 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:11:07.999468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lccf5" event={"ID":"b7def839-f2b6-4ceb-9338-57bbb74327a3","Type":"ContainerStarted","Data":"9733044367ed3164a59bf33aba0b8bf0e4ecba32548ae48b93dce2390650ac37"} Apr 21 07:11:08.001879 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.001851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" event={"ID":"fad82241-109b-4dde-923e-45ecd4be2d96","Type":"ContainerStarted","Data":"03de4c3172a38d0d576da7107d41fab7566c9e3bc32e51eaf5402f3a6060069b"} Apr 21 07:11:08.004849 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.004794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"e5fbbcdbcc5a433b66b929248f4bcf3f214cf0cf93cf32048620e70d5ff764e8"} Apr 21 07:11:08.006477 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.006447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r8m6x" event={"ID":"dec13d58-9abe-4cbd-a479-45ceea3970a9","Type":"ContainerStarted","Data":"6e2af66d624c9032da7a0ffd7eb7909f509d5e6d946393bb5a92ae873eb2a45f"} Apr 21 07:11:08.007804 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.007779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nr6p6" event={"ID":"c5aed744-8c66-4da8-b412-288b462f285b","Type":"ContainerStarted","Data":"779d1dcfd8665f6dad57bf14faa11bc8e7239c8fa7de40eae00fed4f98dcc019"} Apr 21 07:11:08.011153 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.011097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" event={"ID":"3c93887b2602d2e26b42ce7ba4f7f773","Type":"ContainerStarted","Data":"200a3b7ebae2d39e95a006c77e40a5b200f86261cfec4645369fb3404cfc33ba"} Apr 21 
07:11:08.013452 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.013410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8w45p" event={"ID":"13d3ab13-d431-4545-bc03-50c6840b6f39","Type":"ContainerStarted","Data":"c14e9800595a52f75c9cf80565a113843816e7b94742a9ca384520b7fdd115fd"} Apr 21 07:11:08.029111 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.028739 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-104.ec2.internal" podStartSLOduration=2.028718564 podStartE2EDuration="2.028718564s" podCreationTimestamp="2026-04-21 07:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:11:08.027874554 +0000 UTC m=+3.565761478" watchObservedRunningTime="2026-04-21 07:11:08.028718564 +0000 UTC m=+3.566605488" Apr 21 07:11:08.590010 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.589919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:08.590166 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.590062 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:08.590166 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.590131 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:10.590110334 +0000 UTC m=+6.127997237 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:08.691116 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.691074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:08.691378 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.691247 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:08.691378 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.691280 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:08.691378 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.691294 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:08.691378 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.691355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:10.691337661 +0000 UTC m=+6.229224569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:08.988778 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.988688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:08.989226 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.988813 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:08.989303 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:08.989232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:08.989763 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:08.989372 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:09.041870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:09.041742 2576 generic.go:358] "Generic (PLEG): container finished" podID="5ef588ca821884d2680061f64d3ed09f" containerID="116f67fbaabf808663b573300ff729ba8a0101e3ba8af978633475e7bb6551d3" exitCode=0 Apr 21 07:11:09.041870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:09.041843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" event={"ID":"5ef588ca821884d2680061f64d3ed09f","Type":"ContainerDied","Data":"116f67fbaabf808663b573300ff729ba8a0101e3ba8af978633475e7bb6551d3"} Apr 21 07:11:10.049160 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:10.048492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" event={"ID":"5ef588ca821884d2680061f64d3ed09f","Type":"ContainerStarted","Data":"1d729994b5f4f6cae3b6ad4bf929846a7cf8094ec53710e13cbb596548a122b3"} Apr 21 07:11:10.608499 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:10.608446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:10.608677 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.608643 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:10.608750 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.608706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs 
podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:14.608688255 +0000 UTC m=+10.146575159 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:10.709601 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:10.709554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:10.709793 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.709747 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:10.709793 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.709771 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:10.709888 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.709795 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:10.709888 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.709853 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:14.709834574 +0000 UTC m=+10.247721480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:10.986403 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:10.986295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:10.986553 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.986490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:10.986864 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:10.986323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:10.986969 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:10.986949 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:12.986647 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:12.986094 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:12.986647 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:12.986268 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:12.986647 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:12.986329 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:12.986647 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:12.986465 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:14.619917 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.619811 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-104.ec2.internal" podStartSLOduration=8.619789812 podStartE2EDuration="8.619789812s" podCreationTimestamp="2026-04-21 07:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:11:10.068708929 +0000 UTC m=+5.606595853" watchObservedRunningTime="2026-04-21 07:11:14.619789812 +0000 UTC m=+10.157676736" Apr 21 07:11:14.620399 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.620353 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8kj2n"] Apr 21 07:11:14.623501 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.623464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.623631 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.623542 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:14.640525 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.640456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:14.640676 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.640616 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:14.640736 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.640689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:22.640669162 +0000 UTC m=+18.178556064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.741641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.741711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-kubelet-config\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.741741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-dbus\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.741789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " 
pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.741903 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.741923 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.741936 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:14.742345 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.741992 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:22.741974339 +0000 UTC m=+18.279861254 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:14.843568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.843170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.843568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.843232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-kubelet-config\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.843568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.843277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-dbus\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.843568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.843464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-dbus\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " 
pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.843568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.843538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62022203-bfe8-44d8-b46f-c1828de0c5a4-kubelet-config\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:14.843920 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.843596 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:14.843920 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.843655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:15.343637028 +0000 UTC m=+10.881523954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:14.987932 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.987435 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:14.987932 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.987546 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:14.987932 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:14.987648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:14.987932 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:14.987782 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:15.348491 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:15.348382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:15.348636 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:15.348523 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:15.348636 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:15.348586 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:16.348568324 +0000 UTC m=+11.886455229 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:15.986644 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:15.986608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:15.987107 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:15.986743 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:16.357202 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:16.357114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:16.357352 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:16.357266 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:16.357352 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:16.357349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:18.35732906 +0000 UTC m=+13.895215961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:16.986057 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:16.985980 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:16.986210 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:16.986116 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:16.986210 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:16.986149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:16.986326 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:16.986281 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:17.986074 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:17.986043 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:17.986604 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:17.986160 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:18.369301 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:18.369209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:18.369455 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:18.369348 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:18.369455 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:18.369410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:22.369395437 +0000 UTC m=+17.907282338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:18.986559 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:18.986524 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:18.986992 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:18.986654 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:18.986992 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:18.986692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:18.986992 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:18.986773 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:19.985720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:19.985687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:19.985896 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:19.985810 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:20.985866 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:20.985823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:20.986284 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:20.985870 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:20.986284 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:20.985960 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:20.986284 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:20.986073 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:21.985816 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:21.985780 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:21.985982 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:21.985930 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:22.401550 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:22.401447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:22.401706 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.401622 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:22.401763 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.401706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:30.40168486 +0000 UTC m=+25.939571764 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:22.704162 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:22.704067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:22.704312 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.704228 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:11:22.704370 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.704317 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.704298087 +0000 UTC m=+34.242184992 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:11:22.804572 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:22.804533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:22.804746 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.804729 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:11:22.804821 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.804755 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:11:22.804821 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.804771 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:11:22.804922 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.804839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.804819004 +0000 UTC m=+34.342705919 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:11:22.986272 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:22.986163 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:22.986272 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:22.986167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:22.986749 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.986318 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:22.986749 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:22.986405 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:23.985638 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:23.985595 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:23.985809 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:23.985736 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:24.986800 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:24.986556 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:24.987389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:24.986578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:24.987389 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:24.986929 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:24.987389 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:24.986966 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:25.075433 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:11:25.075757 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075737 2576 generic.go:358] "Generic (PLEG): container finished" podID="e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a" containerID="4f733dc1f44e20621f26cbc7680469fbfc0f1268689629f3593e21f977b5503b" exitCode=1
Apr 21 07:11:25.075823 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075806 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"1f8f0b9e72a678be9b0669d351501c055c5b6112849afa585ed681eb6e6b6126"}
Apr 21 07:11:25.075873 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"0f83d89b3207271ddad0fda268efef70b382e8116aa7b6660fa085fc57e08241"}
Apr 21 07:11:25.075873 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"33110148c32509f7c6071328aa8d25f18343ddc3824c33ae43a98a600864d454"}
Apr 21 07:11:25.075873 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerDied","Data":"4f733dc1f44e20621f26cbc7680469fbfc0f1268689629f3593e21f977b5503b"}
Apr 21 07:11:25.075873 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.075864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"2181973ac1a89aae8bf65b88424e65e04176c11e4eb1e153fb1bac4837cc1786"}
Apr 21 07:11:25.077118 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.077093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r8m6x" event={"ID":"dec13d58-9abe-4cbd-a479-45ceea3970a9","Type":"ContainerStarted","Data":"29758083e2b489dad29501b3331904198fe579fea947e43098e219df3d363fd9"}
Apr 21 07:11:25.078524 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.078500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nr6p6" event={"ID":"c5aed744-8c66-4da8-b412-288b462f285b","Type":"ContainerStarted","Data":"feb3e8e1c6dc2055f8ce99e0f79828e614184cfe015f1cb244de05ad5d111051"}
Apr 21 07:11:25.079922 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.079893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8w45p" event={"ID":"13d3ab13-d431-4545-bc03-50c6840b6f39","Type":"ContainerStarted","Data":"d29ebade9f31fba53eb8bc6b6240fe0f4a8b78d51098869dcbb4bf1c322cc404"}
Apr 21 07:11:25.081556 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.081527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v9k64" event={"ID":"4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66","Type":"ContainerStarted","Data":"f5fcd6897540e6768cdc1d4f83e74f485623d6cceb97ed925f49158c956362a3"}
Apr 21 07:11:25.084830 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.083099 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="10fcda5dc1e694be2871b492f5e51d309f8e4db2ae8fd9b0ed46ad0219d5d4d0" exitCode=0
Apr 21 07:11:25.084830 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.083160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"10fcda5dc1e694be2871b492f5e51d309f8e4db2ae8fd9b0ed46ad0219d5d4d0"}
Apr 21 07:11:25.086012 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.085988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f8dzv" event={"ID":"554c2ac8-64c6-4da1-80ab-4059aee84a3e","Type":"ContainerStarted","Data":"10e5362b25af8edd167cfe34cd3e0f79a466b216cf338badaaf9bf2116007d74"}
Apr 21 07:11:25.087595 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.087568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" event={"ID":"fad82241-109b-4dde-923e-45ecd4be2d96","Type":"ContainerStarted","Data":"125a27a1bbe7557bf02ae0e7a61f0c4e726c67fd7026a3d7b3344effb908eeb9"}
Apr 21 07:11:25.100373 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.100325 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r8m6x" podStartSLOduration=3.207843236 podStartE2EDuration="20.100312768s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.664688027 +0000 UTC m=+3.202574932" lastFinishedPulling="2026-04-21 07:11:24.557157546 +0000 UTC m=+20.095044464" observedRunningTime="2026-04-21 07:11:25.099628064 +0000 UTC m=+20.637515005" watchObservedRunningTime="2026-04-21 07:11:25.100312768 +0000 UTC m=+20.638199691"
Apr 21 07:11:25.115721 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.115672 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v9k64" podStartSLOduration=3.269397431 podStartE2EDuration="20.11566006s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.665956742 +0000 UTC m=+3.203843645" lastFinishedPulling="2026-04-21 07:11:24.512219358 +0000 UTC m=+20.050106274" observedRunningTime="2026-04-21 07:11:25.115355966 +0000 UTC m=+20.653242901" watchObservedRunningTime="2026-04-21 07:11:25.11566006 +0000 UTC m=+20.653546983"
Apr 21 07:11:25.185759 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.185713 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-f8dzv" podStartSLOduration=3.345151034 podStartE2EDuration="20.185695086s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.659737121 +0000 UTC m=+3.197624022" lastFinishedPulling="2026-04-21 07:11:24.500281169 +0000 UTC m=+20.038168074" observedRunningTime="2026-04-21 07:11:25.155472106 +0000 UTC m=+20.693359030" watchObservedRunningTime="2026-04-21 07:11:25.185695086 +0000 UTC m=+20.723582010"
Apr 21 07:11:25.185894 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.185790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8w45p" podStartSLOduration=3.327546326 podStartE2EDuration="20.185782977s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.666240612 +0000 UTC m=+3.204127513" lastFinishedPulling="2026-04-21 07:11:24.524477263 +0000 UTC m=+20.062364164" observedRunningTime="2026-04-21 07:11:25.185200086 +0000 UTC m=+20.723087009" watchObservedRunningTime="2026-04-21 07:11:25.185782977 +0000 UTC m=+20.723669904"
Apr 21 07:11:25.636183 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.636144 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f8dzv"
Apr 21 07:11:25.636854 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.636830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f8dzv"
Apr 21 07:11:25.662420 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.662353 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nr6p6" podStartSLOduration=3.825119319 podStartE2EDuration="20.662335029s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.663054221 +0000 UTC m=+3.200941123" lastFinishedPulling="2026-04-21 07:11:24.500269918 +0000 UTC m=+20.038156833" observedRunningTime="2026-04-21 07:11:25.203463803 +0000 UTC m=+20.741350725" watchObservedRunningTime="2026-04-21 07:11:25.662335029 +0000 UTC m=+21.200221954"
Apr 21 07:11:25.859864 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.859806 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 07:11:25.924232 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.924133 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:11:25.859826772Z","UUID":"4ab3c1a6-c425-42ec-8a8c-d90f87d161c8","Handler":null,"Name":"","Endpoint":""}
Apr 21 07:11:25.926060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.926016 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 07:11:25.926060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.926051 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 07:11:25.986112 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:25.986081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:25.986284 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:25.986211 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:26.091052 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.090932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lccf5" event={"ID":"b7def839-f2b6-4ceb-9338-57bbb74327a3","Type":"ContainerStarted","Data":"4870d7bbb27eb856497c4b08cfc862dfbb999f4d54643f90ef1fb5e5227c05e4"}
Apr 21 07:11:26.092768 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.092738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" event={"ID":"fad82241-109b-4dde-923e-45ecd4be2d96","Type":"ContainerStarted","Data":"be9297808dba1b5f7613a58ef46599c4ca8d921c6b8aa95eba9f55f481f79dce"}
Apr 21 07:11:26.095559 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.095539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:11:26.095975 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.095950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"0d5a24ea5bd80fa20342f3420e4321d9630c8186aa3270f927ea2ce8585c0687"}
Apr 21 07:11:26.108198 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.108154 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lccf5" podStartSLOduration=4.254349401 podStartE2EDuration="21.108142463s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.658365855 +0000 UTC m=+3.196252770" lastFinishedPulling="2026-04-21 07:11:24.512158927 +0000 UTC m=+20.050045832" observedRunningTime="2026-04-21 07:11:26.107930311 +0000 UTC m=+21.645817248" watchObservedRunningTime="2026-04-21 07:11:26.108142463 +0000 UTC m=+21.646029423"
Apr 21 07:11:26.986617 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.986532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:26.986617 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:26.986553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:26.986908 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:26.986684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:26.986908 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:26.986819 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:27.099428 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:27.099391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" event={"ID":"fad82241-109b-4dde-923e-45ecd4be2d96","Type":"ContainerStarted","Data":"900bf392248f70ea663966cb8a039eb9c8d94a65fef355f33a501bc6aa9756fa"}
Apr 21 07:11:27.099835 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:27.099489 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 07:11:27.122679 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:27.122641 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ln5ss" podStartSLOduration=3.159390586 podStartE2EDuration="22.122625356s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.656302722 +0000 UTC m=+3.194189623" lastFinishedPulling="2026-04-21 07:11:26.619537492 +0000 UTC m=+22.157424393" observedRunningTime="2026-04-21 07:11:27.122246379 +0000 UTC m=+22.660133314" watchObservedRunningTime="2026-04-21 07:11:27.122625356 +0000 UTC m=+22.660512316"
Apr 21 07:11:27.986205 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:27.986172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:27.986378 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:27.986287 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:28.104303 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:28.104245 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:11:28.104733 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:28.104662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"79070d20a3724756eaa1704e563e64771a889752f13ff7ea887436fa99b27d42"}
Apr 21 07:11:28.986547 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:28.986300 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:28.986761 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:28.986353 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:28.986761 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:28.986669 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:28.986873 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:28.986753 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:29.986028 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:29.985778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:29.986738 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:29.986078 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4"
Apr 21 07:11:30.111508 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.111478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:11:30.111871 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.111846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"4dca0efa0ade8b2c02a81b4b9f17ae6808bf298033b062bbb002688f3d890064"}
Apr 21 07:11:30.112251 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.112224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs"
Apr 21 07:11:30.112373 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.112272 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs"
Apr 21 07:11:30.112373 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.112365 2576 scope.go:117] "RemoveContainer" containerID="4f733dc1f44e20621f26cbc7680469fbfc0f1268689629f3593e21f977b5503b"
Apr 21 07:11:30.113661 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.113634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerStarted","Data":"b992676c517f32cdb9b9f8c1c0134a696140962c093de7489d1bc6f2f4b15f85"}
Apr 21 07:11:30.127622 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.127603 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs"
Apr 21 07:11:30.460331 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.460299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n"
Apr 21 07:11:30.460461 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:30.460431 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:30.460495 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:30.460485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret podName:62022203-bfe8-44d8-b46f-c1828de0c5a4 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:46.46047133 +0000 UTC m=+41.998358231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret") pod "global-pull-secret-syncer-8kj2n" (UID: "62022203-bfe8-44d8-b46f-c1828de0c5a4") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:11:30.993568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.991452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:11:30.993568 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:30.991619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973"
Apr 21 07:11:30.993568 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:30.992037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4"
Apr 21 07:11:30.993568 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:30.992197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313"
Apr 21 07:11:31.118912 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.118881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:11:31.119274 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.119231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" event={"ID":"e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a","Type":"ContainerStarted","Data":"4367d82f3e57a2860cafbe8da773652b96f6c53f4ae156f4aaac75143245a50d"}
Apr 21 07:11:31.119473 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.119454 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs"
Apr 21 07:11:31.121091 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.121066 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="b992676c517f32cdb9b9f8c1c0134a696140962c093de7489d1bc6f2f4b15f85" exitCode=0
Apr 21 07:11:31.121218 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.121104 2576 kubelet.go:2569] "SyncLoop (PLEG): event
for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"b992676c517f32cdb9b9f8c1c0134a696140962c093de7489d1bc6f2f4b15f85"} Apr 21 07:11:31.134452 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.134432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:11:31.152795 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.152757 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" podStartSLOduration=9.236381257 podStartE2EDuration="26.152745217s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.667435677 +0000 UTC m=+3.205322579" lastFinishedPulling="2026-04-21 07:11:24.583799624 +0000 UTC m=+20.121686539" observedRunningTime="2026-04-21 07:11:31.151304609 +0000 UTC m=+26.689191531" watchObservedRunningTime="2026-04-21 07:11:31.152745217 +0000 UTC m=+26.690632145" Apr 21 07:11:31.699990 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.699959 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8kj2n"] Apr 21 07:11:31.700298 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.700102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:31.700298 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:31.700205 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:31.703064 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.702982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fqxn4"] Apr 21 07:11:31.703190 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.703087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:31.703300 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:31.703197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:31.703733 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.703708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zv2g2"] Apr 21 07:11:31.703847 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:31.703779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:31.703903 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:31.703862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:32.986081 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:32.986054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:32.986464 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:32.986054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:32.986464 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:32.986153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:32.986464 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:32.986212 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:33.125904 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:33.125865 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="5852c4f9b047b3588f883ffec709514531ab46aa7a659297709d661729929526" exitCode=0 Apr 21 07:11:33.126052 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:33.125916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"5852c4f9b047b3588f883ffec709514531ab46aa7a659297709d661729929526"} Apr 21 07:11:33.986494 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:33.986342 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:33.986753 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:33.986580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:34.129891 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:34.129869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerStarted","Data":"d02a350ddcfa31e600caecf4388f7d1886699ccea16389f36a353d784c28b447"} Apr 21 07:11:34.986889 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:34.986858 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:34.987292 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:34.986942 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:34.987292 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:34.986983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:34.987292 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:34.987047 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:35.133070 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:35.133038 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="d02a350ddcfa31e600caecf4388f7d1886699ccea16389f36a353d784c28b447" exitCode=0 Apr 21 07:11:35.133227 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:35.133075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"d02a350ddcfa31e600caecf4388f7d1886699ccea16389f36a353d784c28b447"} Apr 21 07:11:35.986592 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:35.986555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:35.986744 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:35.986694 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fqxn4" podUID="61710589-be37-470a-8046-39c730b38313" Apr 21 07:11:36.110186 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:36.110147 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:36.110589 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:36.110309 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:11:36.110769 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:36.110756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f8dzv" Apr 21 07:11:36.985948 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:36.985915 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:36.986123 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:36.986037 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8kj2n" podUID="62022203-bfe8-44d8-b46f-c1828de0c5a4" Apr 21 07:11:36.986123 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:36.986102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:36.986241 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:36.986193 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zv2g2" podUID="53e6ffe6-1b54-4a2a-8aa1-0a1d310df973" Apr 21 07:11:37.337323 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.337293 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-104.ec2.internal" event="NodeReady" Apr 21 07:11:37.337851 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.337435 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 07:11:37.377352 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.377321 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7798d89bcd-ptk56"] Apr 21 07:11:37.399041 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.399014 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gv622"] Apr 21 07:11:37.399185 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.399167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.402063 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.401871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 07:11:37.402063 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.401962 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 07:11:37.402570 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.402525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 07:11:37.404592 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.404572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sgb5p\"" Apr 21 07:11:37.414646 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:11:37.414605 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 07:11:37.425139 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.425069 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7798d89bcd-ptk56"] Apr 21 07:11:37.425139 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.425101 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6qpz8"] Apr 21 07:11:37.425316 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.425211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.428751 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.428730 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:11:37.428853 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.428793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b24jv\"" Apr 21 07:11:37.428975 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.428956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:11:37.439607 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.439588 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6qpz8"] Apr 21 07:11:37.439607 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.439610 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gv622"] Apr 21 07:11:37.439744 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.439719 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:37.442784 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.442466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:11:37.442784 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.442642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vr4v6\"" Apr 21 07:11:37.442784 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.442676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:11:37.442784 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.442699 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:11:37.516314 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516314 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.516314 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/98939484-0f22-46f7-9460-702c1eb19754-tmp-dir\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x82cx\" (UniqueName: \"kubernetes.io/projected/98939484-0f22-46f7-9460-702c1eb19754-kube-api-access-x82cx\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " 
pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516619 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rw9l\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516943 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516943 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.516943 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.516706 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98939484-0f22-46f7-9460-702c1eb19754-config-volume\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617012 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.616979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98939484-0f22-46f7-9460-702c1eb19754-config-volume\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617220 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:11:37.617111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98939484-0f22-46f7-9460-702c1eb19754-tmp-dir\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617220 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x82cx\" (UniqueName: \"kubernetes.io/projected/98939484-0f22-46f7-9460-702c1eb19754-kube-api-access-x82cx\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") 
" pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.617234 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.617267 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.617340 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.117319366 +0000 UTC m=+33.655206269 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.617386 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rw9l\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhdsf\" (UniqueName: \"kubernetes.io/projected/1ca15324-b979-4a39-9c0a-defe74d51dd0-kube-api-access-hhdsf\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.617441 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.117424477 +0000 UTC m=+33.655311379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98939484-0f22-46f7-9460-702c1eb19754-tmp-dir\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.617705 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.618201 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.617704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.618437 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.618413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.618753 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.618731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.622204 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.622184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.622311 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.622219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.627285 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.627232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98939484-0f22-46f7-9460-702c1eb19754-config-volume\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.629547 ip-10-0-139-104 kubenswrapper[2576]: 
I0421 07:11:37.629520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rw9l\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.630229 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.630206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x82cx\" (UniqueName: \"kubernetes.io/projected/98939484-0f22-46f7-9460-702c1eb19754-kube-api-access-x82cx\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:37.630982 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.630958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:37.718938 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.718907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhdsf\" (UniqueName: \"kubernetes.io/projected/1ca15324-b979-4a39-9c0a-defe74d51dd0-kube-api-access-hhdsf\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:37.719072 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.718973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" 
Apr 21 07:11:37.719124 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.719101 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:37.719180 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:37.719170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.219153386 +0000 UTC m=+33.757040298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:11:37.729716 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.729689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhdsf\" (UniqueName: \"kubernetes.io/projected/1ca15324-b979-4a39-9c0a-defe74d51dd0-kube-api-access-hhdsf\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:37.986356 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.986329 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:37.989230 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.989210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:11:37.989343 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:37.989220 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-92shx\"" Apr 21 07:11:38.123529 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.123494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:38.123713 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.123547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:38.123713 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.123620 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:38.123713 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.123645 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:38.123713 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.123683 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 
07:11:38.123713 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.123705 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd nodeName:}" failed. No retries permitted until 2026-04-21 07:11:39.123685363 +0000 UTC m=+34.661572266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:38.123896 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.123729 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:39.123718406 +0000 UTC m=+34.661605315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:38.224164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.224127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:38.224352 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.224320 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:38.224415 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.224402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:39.224384845 +0000 UTC m=+34.762271749 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:11:38.729359 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.729321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:11:38.730023 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.729472 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:11:38.730023 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.729555 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs podName:61710589-be37-470a-8046-39c730b38313 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:10.729538434 +0000 UTC m=+66.267425336 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs") pod "network-metrics-daemon-fqxn4" (UID: "61710589-be37-470a-8046-39c730b38313") : secret "metrics-daemon-secret" not found Apr 21 07:11:38.830148 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.830102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:38.830350 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.830297 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:38.830350 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.830320 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:38.830350 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.830332 2576 projected.go:194] Error preparing data for projected volume kube-api-access-926lj for pod openshift-network-diagnostics/network-check-target-zv2g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:38.830506 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:38.830383 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj podName:53e6ffe6-1b54-4a2a-8aa1-0a1d310df973 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:12:10.830369359 +0000 UTC m=+66.368256260 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-926lj" (UniqueName: "kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj") pod "network-check-target-zv2g2" (UID: "53e6ffe6-1b54-4a2a-8aa1-0a1d310df973") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:38.986274 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.986179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:11:38.986630 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.986515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:38.989791 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.989767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 07:11:38.989791 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.989783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:11:38.990748 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.990726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7djql\"" Apr 21 07:11:38.990855 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:38.990735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:11:39.133829 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:39.133790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:39.133829 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:39.133838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:39.134072 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.133952 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:39.134072 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.133964 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:39.134072 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.133986 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:39.134072 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.134011 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:41.13399342 +0000 UTC m=+36.671880321 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:39.134072 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.134044 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd nodeName:}" failed. No retries permitted until 2026-04-21 07:11:41.134024881 +0000 UTC m=+36.671911785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:39.235461 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:39.235429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:39.235624 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.235597 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:39.235691 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:39.235666 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:41.235645014 +0000 UTC m=+36.773531914 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:11:41.153216 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:41.153075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:41.153216 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:41.153121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:41.153664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.153241 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:41.153664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.153284 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:41.153664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.153243 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:41.153664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.153356 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd 
nodeName:}" failed. No retries permitted until 2026-04-21 07:11:45.15333592 +0000 UTC m=+40.691222823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:41.153664 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.153425 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:45.153405268 +0000 UTC m=+40.691292181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:41.254682 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:41.254507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:41.254778 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.254653 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:41.254778 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:41.254764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:45.254749757 +0000 UTC m=+40.792636658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:11:42.148465 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:42.148429 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="11c2a70bd4ec697c40aaa7b0a6529f108ebaeecbf8e2b5af07f15a67eff3d0b0" exitCode=0 Apr 21 07:11:42.148465 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:42.148470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"11c2a70bd4ec697c40aaa7b0a6529f108ebaeecbf8e2b5af07f15a67eff3d0b0"} Apr 21 07:11:43.152803 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:43.152774 2576 generic.go:358] "Generic (PLEG): container finished" podID="e12015e6-2082-4f37-be78-ba178fd7beec" containerID="d77a94daa9019d4e390684ef6eda1618c6f6ea5935a636d4bba2aa2cba9c2d46" exitCode=0 Apr 21 07:11:43.153169 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:43.152834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerDied","Data":"d77a94daa9019d4e390684ef6eda1618c6f6ea5935a636d4bba2aa2cba9c2d46"} Apr 21 07:11:44.158037 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:44.158000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" event={"ID":"e12015e6-2082-4f37-be78-ba178fd7beec","Type":"ContainerStarted","Data":"f2384a2289f714f9237b428e1a1d7ca133b2cbfb66d4ccfe0f4fb24cbd632f45"} Apr 21 07:11:45.182011 ip-10-0-139-104 kubenswrapper[2576]: 
I0421 07:11:45.181978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:45.182011 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.182012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:45.182526 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.182129 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:45.182526 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.182154 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:45.182526 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.182212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd nodeName:}" failed. No retries permitted until 2026-04-21 07:11:53.182193252 +0000 UTC m=+48.720080170 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:45.182526 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.182126 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:45.182526 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.182299 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:53.182279689 +0000 UTC m=+48.720166601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:45.283126 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.283095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:45.283278 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.283244 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:45.283330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:45.283321 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:53.283305663 +0000 UTC m=+48.821192563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:11:45.765057 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.765005 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vzbdr" podStartSLOduration=7.339495443 podStartE2EDuration="40.764987574s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:07.662426765 +0000 UTC m=+3.200313677" lastFinishedPulling="2026-04-21 07:11:41.087918908 +0000 UTC m=+36.625805808" observedRunningTime="2026-04-21 07:11:44.21515195 +0000 UTC m=+39.753038874" watchObservedRunningTime="2026-04-21 07:11:45.764987574 +0000 UTC m=+41.302874498" Apr 21 07:11:45.765749 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.765721 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r"] Apr 21 07:11:45.768818 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.768804 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" Apr 21 07:11:45.776617 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.776599 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:45.776935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.776922 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-b554z\"" Apr 21 07:11:45.777442 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.777430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 07:11:45.785833 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.785813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndx8v\" (UniqueName: \"kubernetes.io/projected/17117959-58a1-463d-89ab-64b5b61b1443-kube-api-access-ndx8v\") pod \"migrator-74bb7799d9-5k65r\" (UID: \"17117959-58a1-463d-89ab-64b5b61b1443\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" Apr 21 07:11:45.794833 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.794812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r"] Apr 21 07:11:45.886154 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:45.886128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndx8v\" (UniqueName: \"kubernetes.io/projected/17117959-58a1-463d-89ab-64b5b61b1443-kube-api-access-ndx8v\") pod \"migrator-74bb7799d9-5k65r\" (UID: \"17117959-58a1-463d-89ab-64b5b61b1443\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" Apr 21 07:11:45.897309 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:11:45.897285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndx8v\" (UniqueName: \"kubernetes.io/projected/17117959-58a1-463d-89ab-64b5b61b1443-kube-api-access-ndx8v\") pod \"migrator-74bb7799d9-5k65r\" (UID: \"17117959-58a1-463d-89ab-64b5b61b1443\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" Apr 21 07:11:46.076752 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.076685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" Apr 21 07:11:46.201793 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.201761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r"] Apr 21 07:11:46.206166 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:46.206136 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17117959_58a1_463d_89ab_64b5b61b1443.slice/crio-23c5d2f3e602cf8add32311eea46cb51544963bb52213d5f1f2ffcf25af70ee5 WatchSource:0}: Error finding container 23c5d2f3e602cf8add32311eea46cb51544963bb52213d5f1f2ffcf25af70ee5: Status 404 returned error can't find the container with id 23c5d2f3e602cf8add32311eea46cb51544963bb52213d5f1f2ffcf25af70ee5 Apr 21 07:11:46.489199 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.489162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:46.491351 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.491334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62022203-bfe8-44d8-b46f-c1828de0c5a4-original-pull-secret\") pod \"global-pull-secret-syncer-8kj2n\" (UID: \"62022203-bfe8-44d8-b46f-c1828de0c5a4\") " pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:46.503401 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.503381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kj2n" Apr 21 07:11:46.633678 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:46.633648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8kj2n"] Apr 21 07:11:46.637404 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:11:46.637379 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62022203_bfe8_44d8_b46f_c1828de0c5a4.slice/crio-b50a052490c3c176ae859a958a291ddbf6bb633c0dbb2ac6e6606916ea87830c WatchSource:0}: Error finding container b50a052490c3c176ae859a958a291ddbf6bb633c0dbb2ac6e6606916ea87830c: Status 404 returned error can't find the container with id b50a052490c3c176ae859a958a291ddbf6bb633c0dbb2ac6e6606916ea87830c Apr 21 07:11:47.163899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:47.163844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8kj2n" event={"ID":"62022203-bfe8-44d8-b46f-c1828de0c5a4","Type":"ContainerStarted","Data":"b50a052490c3c176ae859a958a291ddbf6bb633c0dbb2ac6e6606916ea87830c"} Apr 21 07:11:47.164914 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:47.164886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" event={"ID":"17117959-58a1-463d-89ab-64b5b61b1443","Type":"ContainerStarted","Data":"23c5d2f3e602cf8add32311eea46cb51544963bb52213d5f1f2ffcf25af70ee5"} Apr 21 07:11:48.562880 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:48.562811 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nr6p6_c5aed744-8c66-4da8-b412-288b462f285b/dns-node-resolver/0.log" Apr 21 07:11:49.171488 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:49.171450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" event={"ID":"17117959-58a1-463d-89ab-64b5b61b1443","Type":"ContainerStarted","Data":"8fd1a4e55f96b0c8d81c2a4cc229c52907fef5379ab250b1862549f92856523e"} Apr 21 07:11:49.171488 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:49.171493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" event={"ID":"17117959-58a1-463d-89ab-64b5b61b1443","Type":"ContainerStarted","Data":"7601a48bf73f8b6e3c0a649d6570647c226939acd351fca3e98c0b0873eeac6c"} Apr 21 07:11:49.200962 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:49.200906 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5k65r" podStartSLOduration=2.139350325 podStartE2EDuration="4.200888467s" podCreationTimestamp="2026-04-21 07:11:45 +0000 UTC" firstStartedPulling="2026-04-21 07:11:46.207914819 +0000 UTC m=+41.745801720" lastFinishedPulling="2026-04-21 07:11:48.26945295 +0000 UTC m=+43.807339862" observedRunningTime="2026-04-21 07:11:49.199039704 +0000 UTC m=+44.736926627" watchObservedRunningTime="2026-04-21 07:11:49.200888467 +0000 UTC m=+44.738775394" Apr 21 07:11:49.760194 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:49.760165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v9k64_4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66/node-ca/0.log" Apr 21 07:11:50.568005 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:50.567974 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5k65r_17117959-58a1-463d-89ab-64b5b61b1443/migrator/0.log" Apr 21 07:11:50.757940 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:50.757916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5k65r_17117959-58a1-463d-89ab-64b5b61b1443/graceful-termination/0.log" Apr 21 07:11:51.177487 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:51.177441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8kj2n" event={"ID":"62022203-bfe8-44d8-b46f-c1828de0c5a4","Type":"ContainerStarted","Data":"706ba336f263aac46cec88fafb813168250c6d0bc67a381a3c071ba5e04b6f3a"} Apr 21 07:11:51.196695 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:51.196555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8kj2n" podStartSLOduration=32.845979003 podStartE2EDuration="37.196537984s" podCreationTimestamp="2026-04-21 07:11:14 +0000 UTC" firstStartedPulling="2026-04-21 07:11:46.639035537 +0000 UTC m=+42.176922439" lastFinishedPulling="2026-04-21 07:11:50.989594503 +0000 UTC m=+46.527481420" observedRunningTime="2026-04-21 07:11:51.196064821 +0000 UTC m=+46.733951744" watchObservedRunningTime="2026-04-21 07:11:51.196537984 +0000 UTC m=+46.734424907" Apr 21 07:11:53.237870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:53.237835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:11:53.237870 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:53.237870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:11:53.238330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.237968 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:53.238330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.237975 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:53.238330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.237991 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7798d89bcd-ptk56: secret "image-registry-tls" not found Apr 21 07:11:53.238330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.238017 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls podName:98939484-0f22-46f7-9460-702c1eb19754 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:09.238004422 +0000 UTC m=+64.775891323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls") pod "dns-default-gv622" (UID: "98939484-0f22-46f7-9460-702c1eb19754") : secret "dns-default-metrics-tls" not found Apr 21 07:11:53.238330 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.238036 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls podName:ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd nodeName:}" failed. No retries permitted until 2026-04-21 07:12:09.238023507 +0000 UTC m=+64.775910411 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls") pod "image-registry-7798d89bcd-ptk56" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd") : secret "image-registry-tls" not found Apr 21 07:11:53.338246 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:11:53.338217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:11:53.338382 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.338354 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:53.338422 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:11:53.338411 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert podName:1ca15324-b979-4a39-9c0a-defe74d51dd0 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:09.338396475 +0000 UTC m=+64.876283376 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert") pod "ingress-canary-6qpz8" (UID: "1ca15324-b979-4a39-9c0a-defe74d51dd0") : secret "canary-serving-cert" not found Apr 21 07:12:03.140156 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:03.140128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmsxs" Apr 21 07:12:09.036194 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.036159 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7798d89bcd-ptk56"] Apr 21 07:12:09.036631 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:09.036346 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" podUID="ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" Apr 21 07:12:09.168043 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.168002 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-g8hn6"] Apr 21 07:12:09.171703 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.171680 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.186121 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.186091 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:12:09.186121 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.186097 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 07:12:09.189629 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.189612 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 07:12:09.202944 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.202925 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9kvhm\"" Apr 21 07:12:09.207694 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.207670 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:12:09.208429 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.208412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g8hn6"] Apr 21 07:12:09.209899 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.209880 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:12:09.213656 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.213639 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:12:09.253057 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253086 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253155 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: 
\"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253177 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rw9l\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253203 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.253389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:12:09.253389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0899e970-711f-4417-b5e5-e887c988472c-crio-socket\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.253389 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0899e970-711f-4417-b5e5-e887c988472c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g8hn6\" 
(UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.253692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0899e970-711f-4417-b5e5-e887c988472c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.253692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l26\" (UniqueName: \"kubernetes.io/projected/0899e970-711f-4417-b5e5-e887c988472c-kube-api-access-h4l26\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.253692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0899e970-711f-4417-b5e5-e887c988472c-data-volume\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.253692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:12:09.253692 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253656 2576 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:12:09.253936 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.253711 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:12:09.254038 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.254018 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:12:09.255906 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.255855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:09.255997 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.255949 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:12:09.256064 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.256000 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l" (OuterVolumeSpecName: "kube-api-access-9rw9l") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "kube-api-access-9rw9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:09.256287 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.256239 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:12:09.256573 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.256553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"image-registry-7798d89bcd-ptk56\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:12:09.256621 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.256601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98939484-0f22-46f7-9460-702c1eb19754-metrics-tls\") pod \"dns-default-gv622\" (UID: \"98939484-0f22-46f7-9460-702c1eb19754\") " pod="openshift-dns/dns-default-gv622" Apr 21 07:12:09.354603 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354504 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") pod \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\" (UID: \"ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd\") " Apr 21 07:12:09.354751 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0899e970-711f-4417-b5e5-e887c988472c-crio-socket\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354751 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0899e970-711f-4417-b5e5-e887c988472c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g8hn6\" (UID: 
\"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354751 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0899e970-711f-4417-b5e5-e887c988472c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354751 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l26\" (UniqueName: \"kubernetes.io/projected/0899e970-711f-4417-b5e5-e887c988472c-kube-api-access-h4l26\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0899e970-711f-4417-b5e5-e887c988472c-crio-socket\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0899e970-711f-4417-b5e5-e887c988472c-data-volume\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354919 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-bound-sa-token\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354934 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-trusted-ca\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.354957 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354948 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-certificates\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.355228 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354964 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-installation-pull-secrets\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.355228 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354981 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-image-registry-private-configuration\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.355228 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.354995 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9rw9l\" 
(UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-kube-api-access-9rw9l\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.355228 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.355010 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-ca-trust-extracted\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.355228 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.355112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0899e970-711f-4417-b5e5-e887c988472c-data-volume\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.355572 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.355548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0899e970-711f-4417-b5e5-e887c988472c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.357363 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.357082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" (UID: "ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:09.360164 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.360141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0899e970-711f-4417-b5e5-e887c988472c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.361180 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.361156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca15324-b979-4a39-9c0a-defe74d51dd0-cert\") pod \"ingress-canary-6qpz8\" (UID: \"1ca15324-b979-4a39-9c0a-defe74d51dd0\") " pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:12:09.369665 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.369642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l26\" (UniqueName: \"kubernetes.io/projected/0899e970-711f-4417-b5e5-e887c988472c-kube-api-access-h4l26\") pod \"insights-runtime-extractor-g8hn6\" (UID: \"0899e970-711f-4417-b5e5-e887c988472c\") " pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.455545 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.455494 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd-registry-tls\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:12:09.479802 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.479774 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g8hn6" Apr 21 07:12:09.538845 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.538676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b24jv\"" Apr 21 07:12:09.546061 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.546039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gv622" Apr 21 07:12:09.556431 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.556406 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vr4v6\"" Apr 21 07:12:09.560596 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.560531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6qpz8" Apr 21 07:12:09.608556 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.608478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g8hn6"] Apr 21 07:12:09.613171 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:09.612976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0899e970_711f_4417_b5e5_e887c988472c.slice/crio-297413e2053440642ccc33aba8084ec074f6d8e9a1e16ece746fb77ce5f3a990 WatchSource:0}: Error finding container 297413e2053440642ccc33aba8084ec074f6d8e9a1e16ece746fb77ce5f3a990: Status 404 returned error can't find the container with id 297413e2053440642ccc33aba8084ec074f6d8e9a1e16ece746fb77ce5f3a990 Apr 21 07:12:09.685378 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.685357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gv622"] Apr 21 07:12:09.707010 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:09.706983 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-6qpz8"] Apr 21 07:12:09.710341 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:09.710317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca15324_b979_4a39_9c0a_defe74d51dd0.slice/crio-66bece4584d27eb6ef97a62fdd1c05875df41be9c5d4c0c96211f5c9ae5c7625 WatchSource:0}: Error finding container 66bece4584d27eb6ef97a62fdd1c05875df41be9c5d4c0c96211f5c9ae5c7625: Status 404 returned error can't find the container with id 66bece4584d27eb6ef97a62fdd1c05875df41be9c5d4c0c96211f5c9ae5c7625 Apr 21 07:12:10.214935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.214890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g8hn6" event={"ID":"0899e970-711f-4417-b5e5-e887c988472c","Type":"ContainerStarted","Data":"7fd949b3010a50cda5f80eb063f552598f5673d29528d7847f185ed20fee21d3"} Apr 21 07:12:10.214935 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.214937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g8hn6" event={"ID":"0899e970-711f-4417-b5e5-e887c988472c","Type":"ContainerStarted","Data":"297413e2053440642ccc33aba8084ec074f6d8e9a1e16ece746fb77ce5f3a990"} Apr 21 07:12:10.216796 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.216766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6qpz8" event={"ID":"1ca15324-b979-4a39-9c0a-defe74d51dd0","Type":"ContainerStarted","Data":"66bece4584d27eb6ef97a62fdd1c05875df41be9c5d4c0c96211f5c9ae5c7625"} Apr 21 07:12:10.218545 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.218515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gv622" event={"ID":"98939484-0f22-46f7-9460-702c1eb19754","Type":"ContainerStarted","Data":"520880c96d996f9973300e787f07b814cc55587346a7a6f5018dbcb318a17c92"} Apr 21 07:12:10.218631 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.218559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7798d89bcd-ptk56" Apr 21 07:12:10.292435 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.292375 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7798d89bcd-ptk56"] Apr 21 07:12:10.292604 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.292549 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7798d89bcd-ptk56"] Apr 21 07:12:10.766138 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.766102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:12:10.769174 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.769060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61710589-be37-470a-8046-39c730b38313-metrics-certs\") pod \"network-metrics-daemon-fqxn4\" (UID: \"61710589-be37-470a-8046-39c730b38313\") " pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:12:10.866826 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.866789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:12:10.869738 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.869715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:12:10.879776 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.879756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:12:10.890325 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.890298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-926lj\" (UniqueName: \"kubernetes.io/projected/53e6ffe6-1b54-4a2a-8aa1-0a1d310df973-kube-api-access-926lj\") pod \"network-check-target-zv2g2\" (UID: \"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973\") " pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:12:10.989019 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.988984 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd" path="/var/lib/kubelet/pods/ac5dc4b6-ecac-48e3-a95e-967b5e23e0bd/volumes" Apr 21 07:12:10.997546 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:10.997524 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-92shx\"" Apr 21 07:12:11.004970 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.004951 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fqxn4" Apr 21 07:12:11.101583 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.101500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7djql\"" Apr 21 07:12:11.108144 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.108081 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:12:11.223043 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.223004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g8hn6" event={"ID":"0899e970-711f-4417-b5e5-e887c988472c","Type":"ContainerStarted","Data":"6dd7418d1fcf64170cdd21e65db7eb85a416fdc11ab66badb0583991f85d9e51"} Apr 21 07:12:11.659650 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.659605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fqxn4"] Apr 21 07:12:11.682531 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:11.682469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zv2g2"] Apr 21 07:12:12.073086 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:12.072940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61710589_be37_470a_8046_39c730b38313.slice/crio-62208b379c785f996922522ee07891aa681d4c06a443564d6b37c194d4706a31 WatchSource:0}: Error finding container 62208b379c785f996922522ee07891aa681d4c06a443564d6b37c194d4706a31: Status 404 returned error can't find the container with id 62208b379c785f996922522ee07891aa681d4c06a443564d6b37c194d4706a31 Apr 21 07:12:12.073678 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:12.073653 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e6ffe6_1b54_4a2a_8aa1_0a1d310df973.slice/crio-e8958db80fc324d5a724d2bdc3357bd4179e3f5ca0a3de6d7fce4453f02155a1 WatchSource:0}: Error finding container e8958db80fc324d5a724d2bdc3357bd4179e3f5ca0a3de6d7fce4453f02155a1: Status 404 returned error can't find the container with id e8958db80fc324d5a724d2bdc3357bd4179e3f5ca0a3de6d7fce4453f02155a1 Apr 21 07:12:12.227035 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:12:12.226993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6qpz8" event={"ID":"1ca15324-b979-4a39-9c0a-defe74d51dd0","Type":"ContainerStarted","Data":"d04fd51c36dd10cec6446e3896862b9ff7fa1d23aea88544c6275fb3e0391cdc"} Apr 21 07:12:12.228041 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.228016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fqxn4" event={"ID":"61710589-be37-470a-8046-39c730b38313","Type":"ContainerStarted","Data":"62208b379c785f996922522ee07891aa681d4c06a443564d6b37c194d4706a31"} Apr 21 07:12:12.228964 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.228942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zv2g2" event={"ID":"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973","Type":"ContainerStarted","Data":"e8958db80fc324d5a724d2bdc3357bd4179e3f5ca0a3de6d7fce4453f02155a1"} Apr 21 07:12:12.230342 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.230321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gv622" event={"ID":"98939484-0f22-46f7-9460-702c1eb19754","Type":"ContainerStarted","Data":"06e73b2c35eb4533b6801f265d52d0c2e197669eae285773e48bd3cf75dad1f3"} Apr 21 07:12:12.230342 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.230345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gv622" event={"ID":"98939484-0f22-46f7-9460-702c1eb19754","Type":"ContainerStarted","Data":"75722560353baf862eb84ecbd36042203fb4d50a845ef7f2fa747b365a7bd9b4"} Apr 21 07:12:12.230484 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.230467 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gv622" Apr 21 07:12:12.231954 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.231928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-g8hn6" event={"ID":"0899e970-711f-4417-b5e5-e887c988472c","Type":"ContainerStarted","Data":"2f9cf6d973bb1ec9ff538ffb49c4f4c2da3bb5ee75a2438db90b698d7bef754c"} Apr 21 07:12:12.252458 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.252418 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6qpz8" podStartSLOduration=33.464229297 podStartE2EDuration="35.252407705s" podCreationTimestamp="2026-04-21 07:11:37 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.712023523 +0000 UTC m=+65.249910424" lastFinishedPulling="2026-04-21 07:12:11.500201913 +0000 UTC m=+67.038088832" observedRunningTime="2026-04-21 07:12:12.251122199 +0000 UTC m=+67.789009121" watchObservedRunningTime="2026-04-21 07:12:12.252407705 +0000 UTC m=+67.790294619" Apr 21 07:12:12.306040 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.305992 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gv622" podStartSLOduration=33.501891038 podStartE2EDuration="35.305979126s" podCreationTimestamp="2026-04-21 07:11:37 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.691343004 +0000 UTC m=+65.229229905" lastFinishedPulling="2026-04-21 07:12:11.495431085 +0000 UTC m=+67.033317993" observedRunningTime="2026-04-21 07:12:12.305003113 +0000 UTC m=+67.842890037" watchObservedRunningTime="2026-04-21 07:12:12.305979126 +0000 UTC m=+67.843866049" Apr 21 07:12:12.306158 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:12.306073 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-g8hn6" podStartSLOduration=0.881117186 podStartE2EDuration="3.306066746s" podCreationTimestamp="2026-04-21 07:12:09 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.674815615 +0000 UTC m=+65.212702527" lastFinishedPulling="2026-04-21 07:12:12.099765169 +0000 UTC m=+67.637652087" 
observedRunningTime="2026-04-21 07:12:12.280303404 +0000 UTC m=+67.818190327" watchObservedRunningTime="2026-04-21 07:12:12.306066746 +0000 UTC m=+67.843953672" Apr 21 07:12:13.735730 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.735692 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd"] Apr 21 07:12:13.738882 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.738861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:13.743046 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.743018 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 07:12:13.743151 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.743021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mgh2r\"" Apr 21 07:12:13.751099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.751062 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd"] Apr 21 07:12:13.787013 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.786984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lh6gd\" (UID: \"a7742e71-92d2-4a6f-bf74-3e022bf70298\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:13.887562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:13.887525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" 
(UniqueName: \"kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lh6gd\" (UID: \"a7742e71-92d2-4a6f-bf74-3e022bf70298\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:13.887736 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:13.887688 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 07:12:13.887807 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:13.887789 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates podName:a7742e71-92d2-4a6f-bf74-3e022bf70298 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:14.38776734 +0000 UTC m=+69.925654260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-lh6gd" (UID: "a7742e71-92d2-4a6f-bf74-3e022bf70298") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 07:12:14.240508 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.240471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fqxn4" event={"ID":"61710589-be37-470a-8046-39c730b38313","Type":"ContainerStarted","Data":"f3b4080057f6fce23815f15f45ff6c5712e469f791b57c35a747e46edcb265ae"} Apr 21 07:12:14.240508 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.240511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fqxn4" event={"ID":"61710589-be37-470a-8046-39c730b38313","Type":"ContainerStarted","Data":"2f8fb89b1079d96c73a721d598e66f74aa30847a751224a5b6a4eda81fc451ce"} Apr 21 07:12:14.391060 ip-10-0-139-104 kubenswrapper[2576]: 
I0421 07:12:14.391020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lh6gd\" (UID: \"a7742e71-92d2-4a6f-bf74-3e022bf70298\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:14.393552 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.393530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a7742e71-92d2-4a6f-bf74-3e022bf70298-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lh6gd\" (UID: \"a7742e71-92d2-4a6f-bf74-3e022bf70298\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:14.648660 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.648562 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" Apr 21 07:12:14.744553 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.744500 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fqxn4" podStartSLOduration=68.532460102 podStartE2EDuration="1m9.744481717s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:12:12.07538034 +0000 UTC m=+67.613267258" lastFinishedPulling="2026-04-21 07:12:13.287401965 +0000 UTC m=+68.825288873" observedRunningTime="2026-04-21 07:12:14.295811249 +0000 UTC m=+69.833698173" watchObservedRunningTime="2026-04-21 07:12:14.744481717 +0000 UTC m=+70.282368639" Apr 21 07:12:14.744974 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.744792 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dfb786946-shg4s"] Apr 21 07:12:14.755460 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.755437 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.761526 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.761501 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h2ctq\"" Apr 21 07:12:14.761851 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.761821 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 07:12:14.762831 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.762802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 07:12:14.763338 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.763301 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 07:12:14.766729 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.764095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 07:12:14.766729 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.764206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 07:12:14.766729 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.764306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 07:12:14.766729 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.764423 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 07:12:14.768607 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.768582 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfb786946-shg4s"] Apr 21 07:12:14.771980 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:12:14.771962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 07:12:14.793785 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.793753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.793911 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.793836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.793962 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.793908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.793962 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.793948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.794048 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.793987 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.794048 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.794027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.794125 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.794064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894429 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894429 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " 
pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894649 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894649 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894649 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894649 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.894649 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.894541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca\") pod \"console-dfb786946-shg4s\" 
(UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.895092 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.895064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.895242 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.895225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.895381 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.895361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.896859 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.896838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.897012 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.896993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config\") pod 
\"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.902847 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.902802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:14.904578 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:14.904561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7\") pod \"console-dfb786946-shg4s\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:15.061931 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:15.061900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd"] Apr 21 07:12:15.065859 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:15.065829 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7742e71_92d2_4a6f_bf74_3e022bf70298.slice/crio-270e52daa8baefab94598539fdb93dfc65e03d41670412add12c7a9523114b9d WatchSource:0}: Error finding container 270e52daa8baefab94598539fdb93dfc65e03d41670412add12c7a9523114b9d: Status 404 returned error can't find the container with id 270e52daa8baefab94598539fdb93dfc65e03d41670412add12c7a9523114b9d Apr 21 07:12:15.071397 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:15.071375 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfb786946-shg4s"
Apr 21 07:12:15.244199 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:15.244167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" event={"ID":"a7742e71-92d2-4a6f-bf74-3e022bf70298","Type":"ContainerStarted","Data":"270e52daa8baefab94598539fdb93dfc65e03d41670412add12c7a9523114b9d"}
Apr 21 07:12:15.301551 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:15.301525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfb786946-shg4s"]
Apr 21 07:12:15.311020 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:15.310995 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b9d91b_3209_46ce_aff9_c5b5a626a7e2.slice/crio-fc46db584ec68240b73f64f1687854cf16922c40a33e19651a5d384f77fd1e43 WatchSource:0}: Error finding container fc46db584ec68240b73f64f1687854cf16922c40a33e19651a5d384f77fd1e43: Status 404 returned error can't find the container with id fc46db584ec68240b73f64f1687854cf16922c40a33e19651a5d384f77fd1e43
Apr 21 07:12:16.249414 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:16.249269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zv2g2" event={"ID":"53e6ffe6-1b54-4a2a-8aa1-0a1d310df973","Type":"ContainerStarted","Data":"ca5734f3e89b4db1f45e01c6be53d9ea842f1d9f51579ddbf34c18f7eedda768"}
Apr 21 07:12:16.250803 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:16.250761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfb786946-shg4s" event={"ID":"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2","Type":"ContainerStarted","Data":"fc46db584ec68240b73f64f1687854cf16922c40a33e19651a5d384f77fd1e43"}
Apr 21 07:12:16.277586 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:16.277529 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zv2g2" podStartSLOduration=68.174116096 podStartE2EDuration="1m11.277511462s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:12:12.075743508 +0000 UTC m=+67.613630408" lastFinishedPulling="2026-04-21 07:12:15.17913885 +0000 UTC m=+70.717025774" observedRunningTime="2026-04-21 07:12:16.276336797 +0000 UTC m=+71.814223720" watchObservedRunningTime="2026-04-21 07:12:16.277511462 +0000 UTC m=+71.815398386"
Apr 21 07:12:17.255489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.255442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" event={"ID":"a7742e71-92d2-4a6f-bf74-3e022bf70298","Type":"ContainerStarted","Data":"2f217a82f81a7ffc61b5bce2b7749d9404737000287801eaafa1fd69b01c96c4"}
Apr 21 07:12:17.255928 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.255602 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zv2g2"
Apr 21 07:12:17.255928 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.255664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd"
Apr 21 07:12:17.262182 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.262158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd"
Apr 21 07:12:17.273077 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.273038 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lh6gd" podStartSLOduration=2.658406696 podStartE2EDuration="4.273022954s" podCreationTimestamp="2026-04-21 07:12:13 +0000 UTC" firstStartedPulling="2026-04-21 07:12:15.067852489 +0000 UTC m=+70.605739390" lastFinishedPulling="2026-04-21 07:12:16.682468747 +0000 UTC m=+72.220355648" observedRunningTime="2026-04-21 07:12:17.272186337 +0000 UTC m=+72.810073262" watchObservedRunningTime="2026-04-21 07:12:17.273022954 +0000 UTC m=+72.810909877"
Apr 21 07:12:17.831668 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.831374 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"]
Apr 21 07:12:17.840171 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.840142 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:17.845290 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.845250 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"]
Apr 21 07:12:17.846398 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 07:12:17.846637 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-v9l5n\""
Apr 21 07:12:17.846748 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846655 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 07:12:17.847169 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846867 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 07:12:17.847169 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 07:12:17.847169 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.846992 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 07:12:17.912918 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.912876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:17.913065 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.912935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfwm7\" (UniqueName: \"kubernetes.io/projected/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-kube-api-access-rfwm7\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:17.913065 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.912989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:17.913065 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:17.913032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.013571 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.013531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.013571 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.013569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.013748 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.013603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.013748 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.013630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfwm7\" (UniqueName: \"kubernetes.io/projected/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-kube-api-access-rfwm7\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.013748 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:18.013709 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 07:12:18.013867 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:18.013781 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls podName:bffb6daf-390e-4b76-88a9-1d5c2be7efa8 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:18.513759502 +0000 UTC m=+74.051646413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-t9dkm" (UID: "bffb6daf-390e-4b76-88a9-1d5c2be7efa8") : secret "prometheus-operator-tls" not found
Apr 21 07:12:18.014222 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.014204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.015869 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.015848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.024689 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.024662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfwm7\" (UniqueName: \"kubernetes.io/projected/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-kube-api-access-rfwm7\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.259563 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.259528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfb786946-shg4s" event={"ID":"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2","Type":"ContainerStarted","Data":"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992"}
Apr 21 07:12:18.517802 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.517759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.520102 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.520069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bffb6daf-390e-4b76-88a9-1d5c2be7efa8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t9dkm\" (UID: \"bffb6daf-390e-4b76-88a9-1d5c2be7efa8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.750474 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.750440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"
Apr 21 07:12:18.864644 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.864534 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dfb786946-shg4s" podStartSLOduration=2.128662324 podStartE2EDuration="4.86451169s" podCreationTimestamp="2026-04-21 07:12:14 +0000 UTC" firstStartedPulling="2026-04-21 07:12:15.31275233 +0000 UTC m=+70.850639234" lastFinishedPulling="2026-04-21 07:12:18.048601688 +0000 UTC m=+73.586488600" observedRunningTime="2026-04-21 07:12:18.29833637 +0000 UTC m=+73.836223290" watchObservedRunningTime="2026-04-21 07:12:18.86451169 +0000 UTC m=+74.402398615"
Apr 21 07:12:18.865425 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:18.865399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t9dkm"]
Apr 21 07:12:18.868688 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:18.868665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffb6daf_390e_4b76_88a9_1d5c2be7efa8.slice/crio-38ef5ffde51311c8db02d774368ca65ffb4214eaf23b2c2a2b3793697460ad82 WatchSource:0}: Error finding container 38ef5ffde51311c8db02d774368ca65ffb4214eaf23b2c2a2b3793697460ad82: Status 404 returned error can't find the container with id 38ef5ffde51311c8db02d774368ca65ffb4214eaf23b2c2a2b3793697460ad82
Apr 21 07:12:19.263661 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:19.263626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm" event={"ID":"bffb6daf-390e-4b76-88a9-1d5c2be7efa8","Type":"ContainerStarted","Data":"38ef5ffde51311c8db02d774368ca65ffb4214eaf23b2c2a2b3793697460ad82"}
Apr 21 07:12:20.268588 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:20.268549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm" event={"ID":"bffb6daf-390e-4b76-88a9-1d5c2be7efa8","Type":"ContainerStarted","Data":"043950531c4afa25724c9bd1e8f66d1a3dd01d5481cef50d70e7b3f2acfb3d3f"}
Apr 21 07:12:20.268990 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:20.268594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm" event={"ID":"bffb6daf-390e-4b76-88a9-1d5c2be7efa8","Type":"ContainerStarted","Data":"a86bfc6ccdf72ff71e914cd30df89521c2f87e970d3b2ffd9db1138d72feab6c"}
Apr 21 07:12:20.292724 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:20.292681 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-t9dkm" podStartSLOduration=2.086426554 podStartE2EDuration="3.292668297s" podCreationTimestamp="2026-04-21 07:12:17 +0000 UTC" firstStartedPulling="2026-04-21 07:12:18.870575734 +0000 UTC m=+74.408462648" lastFinishedPulling="2026-04-21 07:12:20.076817476 +0000 UTC m=+75.614704391" observedRunningTime="2026-04-21 07:12:20.291221826 +0000 UTC m=+75.829108749" watchObservedRunningTime="2026-04-21 07:12:20.292668297 +0000 UTC m=+75.830555241"
Apr 21 07:12:22.195629 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.195592 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"]
Apr 21 07:12:22.198809 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.198793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.201377 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.201358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 07:12:22.201693 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.201677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 07:12:22.201741 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.201682 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5jr9l\""
Apr 21 07:12:22.216021 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.215998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"]
Apr 21 07:12:22.224021 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.223998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xtsqs"]
Apr 21 07:12:22.227100 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.227082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.229502 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.229474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k9dbk\""
Apr 21 07:12:22.229668 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.229650 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 07:12:22.229834 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.229817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 07:12:22.229906 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.229881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 07:12:22.237666 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.237649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gv622"
Apr 21 07:12:22.245012 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.244989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.245094 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.245023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.245094 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.245043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.245094 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.245060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlzxw\" (UniqueName: \"kubernetes.io/projected/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-kube-api-access-hlzxw\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.345720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-sys\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.345880 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bqb\" (UniqueName: \"kubernetes.io/projected/3a983a34-5d35-4c65-acc4-d7568c214769-kube-api-access-88bqb\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.345880 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.345880 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-metrics-client-ca\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-tls\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.345974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-root\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-textfile\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346214 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.346214 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.346326 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.346326 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlzxw\" (UniqueName: \"kubernetes.io/projected/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-kube-api-access-hlzxw\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.346326 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:22.346297 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 21 07:12:22.346463 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:22.346363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls podName:c1073f55-68d9-442d-9f9f-9e6ee5b25d39 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:22.846340764 +0000 UTC m=+78.384227667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-98zlw" (UID: "c1073f55-68d9-442d-9f9f-9e6ee5b25d39") : secret "openshift-state-metrics-tls" not found
Apr 21 07:12:22.346463 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-wtmp\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.346857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.346838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.348460 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.348443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.354990 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.354970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlzxw\" (UniqueName: \"kubernetes.io/projected/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-kube-api-access-hlzxw\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.447244 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-textfile\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447244 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-wtmp\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-sys\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88bqb\" (UniqueName: \"kubernetes.io/projected/3a983a34-5d35-4c65-acc4-d7568c214769-kube-api-access-88bqb\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-metrics-client-ca\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-tls\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447421 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-root\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447719 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-sys\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447719 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447719 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-wtmp\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447719 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a983a34-5d35-4c65-acc4-d7568c214769-root\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.447719 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-textfile\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.448028 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.447900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.448084 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.448038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a983a34-5d35-4c65-acc4-d7568c214769-metrics-client-ca\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.449658 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.449637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.449777 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.449743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a983a34-5d35-4c65-acc4-d7568c214769-node-exporter-tls\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.462178 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.462150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bqb\" (UniqueName: \"kubernetes.io/projected/3a983a34-5d35-4c65-acc4-d7568c214769-kube-api-access-88bqb\") pod \"node-exporter-xtsqs\" (UID: \"3a983a34-5d35-4c65-acc4-d7568c214769\") " pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.535941 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.535912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtsqs"
Apr 21 07:12:22.543830 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:22.543806 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a983a34_5d35_4c65_acc4_d7568c214769.slice/crio-aa01d438829c1fbafc633e9d6da1fd920d1998004aacbb23175785d06f8b1192 WatchSource:0}: Error finding container aa01d438829c1fbafc633e9d6da1fd920d1998004aacbb23175785d06f8b1192: Status 404 returned error can't find the container with id aa01d438829c1fbafc633e9d6da1fd920d1998004aacbb23175785d06f8b1192
Apr 21 07:12:22.850548 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.850513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:22.852772 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:22.852742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1073f55-68d9-442d-9f9f-9e6ee5b25d39-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-98zlw\" (UID: \"c1073f55-68d9-442d-9f9f-9e6ee5b25d39\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:23.107911 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.107846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"
Apr 21 07:12:23.226907 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.226867 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw"]
Apr 21 07:12:23.278749 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.278705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtsqs" event={"ID":"3a983a34-5d35-4c65-acc4-d7568c214769","Type":"ContainerStarted","Data":"aa01d438829c1fbafc633e9d6da1fd920d1998004aacbb23175785d06f8b1192"}
Apr 21 07:12:23.303238 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.303208 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 07:12:23.338841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.338804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 07:12:23.338986 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.338972 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.342043 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.341969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 07:12:23.342219 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 07:12:23.342402 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342379 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 07:12:23.342515 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342490 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bzwvx\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342663 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342748 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342869 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 07:12:23.343099 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.342947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 07:12:23.422906 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:23.422836 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1073f55_68d9_442d_9f9f_9e6ee5b25d39.slice/crio-a50c81645f44eea5061998052f88ea8386211c9c42c12e58d97ee5c1a7c0b78f WatchSource:0}: Error finding container a50c81645f44eea5061998052f88ea8386211c9c42c12e58d97ee5c1a7c0b78f: Status 404 returned error can't find the container with id a50c81645f44eea5061998052f88ea8386211c9c42c12e58d97ee5c1a7c0b78f Apr 21 07:12:23.455460 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455552 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455552 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455552 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455672 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-config-out\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455672 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455672 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455672 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455804 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455804 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455804 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455911 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd6v6\" (UniqueName: 
\"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-kube-api-access-gd6v6\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.455911 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.455827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-web-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557171 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557171 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd6v6\" (UniqueName: \"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-kube-api-access-gd6v6\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557171 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:12:23.557122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-web-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557338 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557392 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557446 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557446 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 21 07:12:23.557446 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-config-out\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557578 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557578 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557578 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.557695 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.557578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.562413 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.562389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.562747 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.562722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.563820 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.563742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181156a-7ac2-4191-829a-c08fd1305d38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.564840 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.564632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.565438 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.564972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-web-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.565438 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.565017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.565438 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.565403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574204 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1181156a-7ac2-4191-829a-c08fd1305d38-config-out\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1181156a-7ac2-4191-829a-c08fd1305d38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.574444 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.574414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd6v6\" (UniqueName: \"kubernetes.io/projected/1181156a-7ac2-4191-829a-c08fd1305d38-kube-api-access-gd6v6\") pod \"alertmanager-main-0\" (UID: \"1181156a-7ac2-4191-829a-c08fd1305d38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.660197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.660168 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:12:23.799737 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:23.799713 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:12:23.801977 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:23.801951 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1181156a_7ac2_4191_829a_c08fd1305d38.slice/crio-7f2934c2e6aa33eba2e9e7918a8e304de68f875b895bc42d5b126e4154911823 WatchSource:0}: Error finding container 7f2934c2e6aa33eba2e9e7918a8e304de68f875b895bc42d5b126e4154911823: Status 404 returned error can't find the container with id 7f2934c2e6aa33eba2e9e7918a8e304de68f875b895bc42d5b126e4154911823 Apr 21 07:12:24.282682 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.282644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw" event={"ID":"c1073f55-68d9-442d-9f9f-9e6ee5b25d39","Type":"ContainerStarted","Data":"fbdf2fe0b1ef5ad709ba7ee326648343c333c144cd284d9c988aaf55ba114a02"} Apr 21 07:12:24.282682 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.282681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw" event={"ID":"c1073f55-68d9-442d-9f9f-9e6ee5b25d39","Type":"ContainerStarted","Data":"53879138b8cef86174fdeac221afb719cd6009c88957ced9db43460e1476c707"} Apr 21 07:12:24.283127 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.282691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw" event={"ID":"c1073f55-68d9-442d-9f9f-9e6ee5b25d39","Type":"ContainerStarted","Data":"a50c81645f44eea5061998052f88ea8386211c9c42c12e58d97ee5c1a7c0b78f"} Apr 21 07:12:24.283985 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.283963 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="3a983a34-5d35-4c65-acc4-d7568c214769" containerID="a293695bfd02e6fde4c29dfcd08239171946435411e676d751b18cf55ac9dc32" exitCode=0 Apr 21 07:12:24.284051 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.284034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtsqs" event={"ID":"3a983a34-5d35-4c65-acc4-d7568c214769","Type":"ContainerDied","Data":"a293695bfd02e6fde4c29dfcd08239171946435411e676d751b18cf55ac9dc32"} Apr 21 07:12:24.284980 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:24.284962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"7f2934c2e6aa33eba2e9e7918a8e304de68f875b895bc42d5b126e4154911823"} Apr 21 07:12:25.072380 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.072344 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:25.072561 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.072392 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:25.078029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.078003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:25.200082 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.200049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7fb796db59-2l7pf"] Apr 21 07:12:25.203485 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.203467 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.206404 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206346 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 07:12:25.206532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206450 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9qdqcke0pkg34\"" Apr 21 07:12:25.206532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ph9gs\"" Apr 21 07:12:25.206532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 07:12:25.206532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206527 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 07:12:25.206746 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 07:12:25.206746 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.206599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 07:12:25.221787 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.221764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fb796db59-2l7pf"] Apr 21 07:12:25.270052 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270029 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270153 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270153 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a53388-3478-4c2a-9bfc-987133408fc6-metrics-client-ca\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270286 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gprzv\" (UniqueName: \"kubernetes.io/projected/12a53388-3478-4c2a-9bfc-987133408fc6-kube-api-access-gprzv\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270441 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.270441 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.270328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-grpc-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.291978 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.291949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtsqs" 
event={"ID":"3a983a34-5d35-4c65-acc4-d7568c214769","Type":"ContainerStarted","Data":"2825c3c6ca49b9fbbd57f32b4971a4c954b92888623b24b8b55dbd1a0fbc9f20"} Apr 21 07:12:25.292367 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.291989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtsqs" event={"ID":"3a983a34-5d35-4c65-acc4-d7568c214769","Type":"ContainerStarted","Data":"97223dd6835d4637b89e0707e22ffac562bba6c96fbc8d2a12b788b4086693a5"} Apr 21 07:12:25.298801 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.298778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:12:25.334002 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.333927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtsqs" podStartSLOduration=2.202313381 podStartE2EDuration="3.333913611s" podCreationTimestamp="2026-04-21 07:12:22 +0000 UTC" firstStartedPulling="2026-04-21 07:12:22.545351867 +0000 UTC m=+78.083238769" lastFinishedPulling="2026-04-21 07:12:23.676952096 +0000 UTC m=+79.214838999" observedRunningTime="2026-04-21 07:12:25.331884128 +0000 UTC m=+80.869771055" watchObservedRunningTime="2026-04-21 07:12:25.333913611 +0000 UTC m=+80.871800534" Apr 21 07:12:25.370660 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.370624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.371028 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.371093 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a53388-3478-4c2a-9bfc-987133408fc6-metrics-client-ca\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.371296 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.372075 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.372075 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gprzv\" (UniqueName: \"kubernetes.io/projected/12a53388-3478-4c2a-9bfc-987133408fc6-kube-api-access-gprzv\") pod 
\"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.372075 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.372075 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.371524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-grpc-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.373026 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.373003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a53388-3478-4c2a-9bfc-987133408fc6-metrics-client-ca\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.373544 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.373519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.374105 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.374077 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.374820 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.374800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.374820 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.374809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-grpc-tls\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.375133 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.375110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.375345 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.375285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/12a53388-3478-4c2a-9bfc-987133408fc6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.386683 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.386663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gprzv\" (UniqueName: \"kubernetes.io/projected/12a53388-3478-4c2a-9bfc-987133408fc6-kube-api-access-gprzv\") pod \"thanos-querier-7fb796db59-2l7pf\" (UID: \"12a53388-3478-4c2a-9bfc-987133408fc6\") " pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.513293 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.513241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:25.629850 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:25.629736 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fb796db59-2l7pf"] Apr 21 07:12:25.633700 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:25.633671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12a53388_3478_4c2a_9bfc_987133408fc6.slice/crio-864ac959f5dc9e082b59a5eabed1fa523013c24472a164bf6903fb81066381f2 WatchSource:0}: Error finding container 864ac959f5dc9e082b59a5eabed1fa523013c24472a164bf6903fb81066381f2: Status 404 returned error can't find the container with id 864ac959f5dc9e082b59a5eabed1fa523013c24472a164bf6903fb81066381f2 Apr 21 07:12:26.296982 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:26.296944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" 
event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"864ac959f5dc9e082b59a5eabed1fa523013c24472a164bf6903fb81066381f2"} Apr 21 07:12:26.298208 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:26.298185 2576 generic.go:358] "Generic (PLEG): container finished" podID="1181156a-7ac2-4191-829a-c08fd1305d38" containerID="f3caefcf737e4263188a66c42cf1269a57fb7e110cfa95aa219f12c742a3bf66" exitCode=0 Apr 21 07:12:26.298337 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:26.298285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerDied","Data":"f3caefcf737e4263188a66c42cf1269a57fb7e110cfa95aa219f12c742a3bf66"} Apr 21 07:12:26.300154 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:26.300134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw" event={"ID":"c1073f55-68d9-442d-9f9f-9e6ee5b25d39","Type":"ContainerStarted","Data":"b5c315cb42731ce30078deb83418d27954f8c519e606e38aa1158f0108df4c97"} Apr 21 07:12:26.349078 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:26.349029 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-98zlw" podStartSLOduration=2.8061650399999998 podStartE2EDuration="4.349016769s" podCreationTimestamp="2026-04-21 07:12:22 +0000 UTC" firstStartedPulling="2026-04-21 07:12:23.67168111 +0000 UTC m=+79.209568011" lastFinishedPulling="2026-04-21 07:12:25.21453284 +0000 UTC m=+80.752419740" observedRunningTime="2026-04-21 07:12:26.347931166 +0000 UTC m=+81.885818103" watchObservedRunningTime="2026-04-21 07:12:26.349016769 +0000 UTC m=+81.886903691" Apr 21 07:12:27.006222 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.006192 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl"] Apr 21 07:12:27.009113 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.009093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:27.012095 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.012069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 07:12:27.012199 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.012147 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9n24z\"" Apr 21 07:12:27.029841 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.029813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl"] Apr 21 07:12:27.086620 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.086590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hzvzl\" (UID: \"22de7198-baa4-4ff3-a818-f14e9fcbe5e7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:27.187551 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.187513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hzvzl\" (UID: \"22de7198-baa4-4ff3-a818-f14e9fcbe5e7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:27.187692 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:12:27.187645 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 07:12:27.187732 ip-10-0-139-104 kubenswrapper[2576]: 
E0421 07:12:27.187704 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert podName:22de7198-baa4-4ff3-a818-f14e9fcbe5e7 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:27.687688021 +0000 UTC m=+83.225574921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-hzvzl" (UID: "22de7198-baa4-4ff3-a818-f14e9fcbe5e7") : secret "monitoring-plugin-cert" not found Apr 21 07:12:27.691598 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.691553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hzvzl\" (UID: \"22de7198-baa4-4ff3-a818-f14e9fcbe5e7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:27.694063 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.694038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22de7198-baa4-4ff3-a818-f14e9fcbe5e7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hzvzl\" (UID: \"22de7198-baa4-4ff3-a818-f14e9fcbe5e7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:27.921311 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:27.921279 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:28.624045 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:28.623884 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl"] Apr 21 07:12:28.628308 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:12:28.628279 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22de7198_baa4_4ff3_a818_f14e9fcbe5e7.slice/crio-53dad657ed7903c060ff16715a7a469ed9412749fd5cc8865d8e6b875d09bfb7 WatchSource:0}: Error finding container 53dad657ed7903c060ff16715a7a469ed9412749fd5cc8865d8e6b875d09bfb7: Status 404 returned error can't find the container with id 53dad657ed7903c060ff16715a7a469ed9412749fd5cc8865d8e6b875d09bfb7 Apr 21 07:12:29.311011 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.310976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"f55f9abd6c5eedf8926cbdf291b28635aae4ff63628bc1a22c695cce743e1507"} Apr 21 07:12:29.311532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.311013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"cba766125687c3944d174d4541f5b7469edb207d35dff4d29399bbc1e4bcfb17"} Apr 21 07:12:29.311532 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.311031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"be61c77ecf3a4a68a62674301f645dbfc82a36b5f4ba11b8bce045190dcf946a"} Apr 21 07:12:29.313849 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.313817 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"2c885791796d5978d8e42aefbfb58f2a712bb95032a3881495050cffbe391f6c"} Apr 21 07:12:29.313849 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.313858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"3461cafe01abdccb8d22f3a336d441b454ae1f7b50e33cda7b1e595e2c5003f1"} Apr 21 07:12:29.314055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.313868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"4bb2b6e4077657db9d21f61766a10eacc8751b45cdcde8ff3d6b8d633e531df2"} Apr 21 07:12:29.314055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.313877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"a91438e092fcad82c6ccf6bef82ae8dc1f307b88c881d408f8b149bb259eac0c"} Apr 21 07:12:29.314055 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.313888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"3bd4f53f7d8fc9720567d4a2726af4bd52b57ebdd6094c7f25099ce5c44cd9a3"} Apr 21 07:12:29.314971 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:29.314933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" event={"ID":"22de7198-baa4-4ff3-a818-f14e9fcbe5e7","Type":"ContainerStarted","Data":"53dad657ed7903c060ff16715a7a469ed9412749fd5cc8865d8e6b875d09bfb7"} Apr 21 07:12:30.320714 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:12:30.320683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1181156a-7ac2-4191-829a-c08fd1305d38","Type":"ContainerStarted","Data":"54202357a436f4f0dad860672d0644032f16d2f172f2c781441d15ab7dcc8b0f"} Apr 21 07:12:30.322027 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.322003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" event={"ID":"22de7198-baa4-4ff3-a818-f14e9fcbe5e7","Type":"ContainerStarted","Data":"e1b968edf6e15ddf6e014ea2913d91d00e79483c328a40c24259054a61ea0423"} Apr 21 07:12:30.322165 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.322145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:30.324352 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.324325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"24737a710b07bf993d42e43f6338491d0e374cd6376f9f5f7ebbc8fa3dc229ec"} Apr 21 07:12:30.324352 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.324349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"68c99a1fe56096ed439e2b0c6876e5ac7b4f28b1ac21bfff9ecd98061fc8848d"} Apr 21 07:12:30.327430 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.327399 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" Apr 21 07:12:30.351989 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.351845 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.035647053 
podStartE2EDuration="7.351830616s" podCreationTimestamp="2026-04-21 07:12:23 +0000 UTC" firstStartedPulling="2026-04-21 07:12:23.80379686 +0000 UTC m=+79.341683761" lastFinishedPulling="2026-04-21 07:12:30.11998042 +0000 UTC m=+85.657867324" observedRunningTime="2026-04-21 07:12:30.350126005 +0000 UTC m=+85.888012927" watchObservedRunningTime="2026-04-21 07:12:30.351830616 +0000 UTC m=+85.889717539" Apr 21 07:12:30.368701 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:30.368608 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hzvzl" podStartSLOduration=2.879855746 podStartE2EDuration="4.368598616s" podCreationTimestamp="2026-04-21 07:12:26 +0000 UTC" firstStartedPulling="2026-04-21 07:12:28.630910488 +0000 UTC m=+84.168797392" lastFinishedPulling="2026-04-21 07:12:30.119653348 +0000 UTC m=+85.657540262" observedRunningTime="2026-04-21 07:12:30.367192444 +0000 UTC m=+85.905079367" watchObservedRunningTime="2026-04-21 07:12:30.368598616 +0000 UTC m=+85.906485538" Apr 21 07:12:31.331439 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:31.331399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" event={"ID":"12a53388-3478-4c2a-9bfc-987133408fc6","Type":"ContainerStarted","Data":"9ce82f3f17d29c5ec88ee801c89713431789a1bf4136a056fb34c3f18243b996"} Apr 21 07:12:31.357309 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:31.357251 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" podStartSLOduration=1.8711823509999999 podStartE2EDuration="6.357237402s" podCreationTimestamp="2026-04-21 07:12:25 +0000 UTC" firstStartedPulling="2026-04-21 07:12:25.635564559 +0000 UTC m=+81.173451460" lastFinishedPulling="2026-04-21 07:12:30.121619597 +0000 UTC m=+85.659506511" observedRunningTime="2026-04-21 07:12:31.355059181 +0000 UTC m=+86.892946104" 
watchObservedRunningTime="2026-04-21 07:12:31.357237402 +0000 UTC m=+86.895124336" Apr 21 07:12:32.334344 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:32.334316 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:33.344369 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:33.344335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7fb796db59-2l7pf" Apr 21 07:12:38.747428 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:38.747393 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfb786946-shg4s"] Apr 21 07:12:48.262156 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:12:48.262128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zv2g2" Apr 21 07:13:03.766457 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:03.766409 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dfb786946-shg4s" podUID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" containerName="console" containerID="cri-o://a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992" gracePeriod=15 Apr 21 07:13:04.005734 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.005711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfb786946-shg4s_b5b9d91b-3209-46ce-aff9-c5b5a626a7e2/console/0.log" Apr 21 07:13:04.005853 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.005772 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:13:04.077513 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077423 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077513 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077462 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077513 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077483 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077513 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077498 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077801 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077642 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077801 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:13:04.077703 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.077961 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077930 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config" (OuterVolumeSpecName: "console-config") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:13:04.078016 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077955 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:13:04.078016 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.077990 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:13:04.078103 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.078078 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca" (OuterVolumeSpecName: "service-ca") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:13:04.079811 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.079779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:13:04.079913 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.079833 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7" (OuterVolumeSpecName: "kube-api-access-zkrj7") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "kube-api-access-zkrj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:13:04.179004 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.178959 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config\") pod \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\" (UID: \"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2\") " Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179127 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-serving-cert\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179144 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-config\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179158 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-oauth-serving-cert\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179170 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-trusted-ca-bundle\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179182 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-service-ca\") on node \"ip-10-0-139-104.ec2.internal\" 
DevicePath \"\"" Apr 21 07:13:04.179197 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.179193 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-kube-api-access-zkrj7\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.181003 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.180977 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" (UID: "b5b9d91b-3209-46ce-aff9-c5b5a626a7e2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:13:04.279744 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.279696 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2-console-oauth-config\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:13:04.422668 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfb786946-shg4s_b5b9d91b-3209-46ce-aff9-c5b5a626a7e2/console/0.log" Apr 21 07:13:04.422668 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422645 2576 generic.go:358] "Generic (PLEG): container finished" podID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" containerID="a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992" exitCode=2 Apr 21 07:13:04.422822 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422704 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfb786946-shg4s" Apr 21 07:13:04.422822 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfb786946-shg4s" event={"ID":"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2","Type":"ContainerDied","Data":"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992"} Apr 21 07:13:04.422822 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfb786946-shg4s" event={"ID":"b5b9d91b-3209-46ce-aff9-c5b5a626a7e2","Type":"ContainerDied","Data":"fc46db584ec68240b73f64f1687854cf16922c40a33e19651a5d384f77fd1e43"} Apr 21 07:13:04.422822 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.422761 2576 scope.go:117] "RemoveContainer" containerID="a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992" Apr 21 07:13:04.431059 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.431041 2576 scope.go:117] "RemoveContainer" containerID="a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992" Apr 21 07:13:04.431332 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:13:04.431311 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992\": container with ID starting with a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992 not found: ID does not exist" containerID="a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992" Apr 21 07:13:04.431391 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.431344 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992"} err="failed to get container status \"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992\": rpc error: code = 
NotFound desc = could not find container \"a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992\": container with ID starting with a1bc7226a5f5aef406dd68874d0f9f40c73828e00c130c26823bfa767d023992 not found: ID does not exist" Apr 21 07:13:04.443489 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.443467 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfb786946-shg4s"] Apr 21 07:13:04.447528 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.447507 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dfb786946-shg4s"] Apr 21 07:13:04.989387 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:04.989353 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" path="/var/lib/kubelet/pods/b5b9d91b-3209-46ce-aff9-c5b5a626a7e2/volumes" Apr 21 07:13:46.509629 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.509542 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7cbbdfdf76-klktm"] Apr 21 07:13:46.510117 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.509908 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" containerName="console" Apr 21 07:13:46.510117 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.509922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" containerName="console" Apr 21 07:13:46.510117 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.509973 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b9d91b-3209-46ce-aff9-c5b5a626a7e2" containerName="console" Apr 21 07:13:46.513102 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.513077 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.515542 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.515518 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 07:13:46.515730 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.515715 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 07:13:46.515947 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.515926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 07:13:46.516073 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.515951 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 07:13:46.516073 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.516013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5pshq\"" Apr 21 07:13:46.516073 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.516050 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 07:13:46.522614 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.522583 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 07:13:46.525439 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.525416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7cbbdfdf76-klktm"] Apr 21 07:13:46.593673 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.593857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv272\" (UniqueName: \"kubernetes.io/projected/2f72e230-0cc4-4b1c-8c70-fb63244807b4-kube-api-access-sv272\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.593857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-federate-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.593857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.593857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-metrics-client-ca\") pod 
\"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.593857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.594060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.594060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.593888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695027 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.694991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-federate-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " 
pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695027 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-metrics-client-ca\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695289 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv272\" (UniqueName: \"kubernetes.io/projected/2f72e230-0cc4-4b1c-8c70-fb63244807b4-kube-api-access-sv272\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.695908 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.695871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-metrics-client-ca\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.696046 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.696015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: 
\"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.696178 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.696066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.697667 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.697642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-telemeter-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.697845 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.697825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-federate-client-tls\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.697910 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.697855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.698083 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.698066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f72e230-0cc4-4b1c-8c70-fb63244807b4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.703634 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.703614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv272\" (UniqueName: \"kubernetes.io/projected/2f72e230-0cc4-4b1c-8c70-fb63244807b4-kube-api-access-sv272\") pod \"telemeter-client-7cbbdfdf76-klktm\" (UID: \"2f72e230-0cc4-4b1c-8c70-fb63244807b4\") " pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.824773 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.824674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" Apr 21 07:13:46.947383 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:46.947357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7cbbdfdf76-klktm"] Apr 21 07:13:46.949597 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:13:46.949556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f72e230_0cc4_4b1c_8c70_fb63244807b4.slice/crio-b672833d70b98087b1891709054906dd8d9296ce99ae578038e3f35892908ff8 WatchSource:0}: Error finding container b672833d70b98087b1891709054906dd8d9296ce99ae578038e3f35892908ff8: Status 404 returned error can't find the container with id b672833d70b98087b1891709054906dd8d9296ce99ae578038e3f35892908ff8 Apr 21 07:13:47.541516 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:47.541479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" 
event={"ID":"2f72e230-0cc4-4b1c-8c70-fb63244807b4","Type":"ContainerStarted","Data":"b672833d70b98087b1891709054906dd8d9296ce99ae578038e3f35892908ff8"} Apr 21 07:13:49.549085 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:49.549046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" event={"ID":"2f72e230-0cc4-4b1c-8c70-fb63244807b4","Type":"ContainerStarted","Data":"af33be370f3150df0282cc7f6eeb1fece45c4984195e30dea2fafea6891dd919"} Apr 21 07:13:49.549085 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:49.549083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" event={"ID":"2f72e230-0cc4-4b1c-8c70-fb63244807b4","Type":"ContainerStarted","Data":"bbb56ab2431e71278fd55c24098252da4308ca38075b0eab3ba0904c915c0054"} Apr 21 07:13:49.549085 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:49.549092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" event={"ID":"2f72e230-0cc4-4b1c-8c70-fb63244807b4","Type":"ContainerStarted","Data":"343f6ffb34713ebf89cb7fba6838eaca16cfbf4aaacddc062402595f7575ecc9"} Apr 21 07:13:49.579601 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:49.579544 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7cbbdfdf76-klktm" podStartSLOduration=1.4183696239999999 podStartE2EDuration="3.579528735s" podCreationTimestamp="2026-04-21 07:13:46 +0000 UTC" firstStartedPulling="2026-04-21 07:13:46.951529121 +0000 UTC m=+162.489416022" lastFinishedPulling="2026-04-21 07:13:49.112688213 +0000 UTC m=+164.650575133" observedRunningTime="2026-04-21 07:13:49.577640124 +0000 UTC m=+165.115527048" watchObservedRunningTime="2026-04-21 07:13:49.579528735 +0000 UTC m=+165.117415690" Apr 21 07:13:50.236734 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.236701 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:13:50.239891 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.239875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.242419 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.242398 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 07:13:50.242883 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.242864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 07:13:50.243561 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.243546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h2ctq\"" Apr 21 07:13:50.243647 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.243591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 07:13:50.243917 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.243904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 07:13:50.245729 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.245710 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 07:13:50.245831 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.245806 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 07:13:50.246242 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.246223 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 07:13:50.251201 ip-10-0-139-104 kubenswrapper[2576]: I0421 
07:13:50.251178 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 07:13:50.253518 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.253498 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:13:50.326096 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326096 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326362 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326362 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config\") pod \"console-76dbdc978-9stx4\" (UID: 
\"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326362 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326362 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.326362 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.326242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h264v\" (UniqueName: \"kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.426915 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.426880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.426926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.426960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.426989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427060 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h264v\" (UniqueName: \"kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427241 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427241 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427105 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427855 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427915 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.427915 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.427902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.429537 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.429517 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.429623 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.429594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.435848 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.435817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h264v\" (UniqueName: \"kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v\") pod \"console-76dbdc978-9stx4\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.549583 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.549479 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:13:50.683381 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:50.683350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:13:50.687015 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:13:50.686988 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cda5447_9e81_4344_952b_8e7afaa5ed8c.slice/crio-82aaa9abbb6ff7d63f3f48b01f958a248d53faea5578a5519166a5bf8a6adc73 WatchSource:0}: Error finding container 82aaa9abbb6ff7d63f3f48b01f958a248d53faea5578a5519166a5bf8a6adc73: Status 404 returned error can't find the container with id 82aaa9abbb6ff7d63f3f48b01f958a248d53faea5578a5519166a5bf8a6adc73 Apr 21 07:13:51.556307 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:51.556274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76dbdc978-9stx4" event={"ID":"5cda5447-9e81-4344-952b-8e7afaa5ed8c","Type":"ContainerStarted","Data":"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815"} Apr 21 07:13:51.556307 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:51.556308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76dbdc978-9stx4" event={"ID":"5cda5447-9e81-4344-952b-8e7afaa5ed8c","Type":"ContainerStarted","Data":"82aaa9abbb6ff7d63f3f48b01f958a248d53faea5578a5519166a5bf8a6adc73"} Apr 21 07:13:51.575553 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:13:51.575499 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76dbdc978-9stx4" podStartSLOduration=1.57548485 podStartE2EDuration="1.57548485s" podCreationTimestamp="2026-04-21 07:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:13:51.574659871 +0000 UTC m=+167.112546795" 
watchObservedRunningTime="2026-04-21 07:13:51.57548485 +0000 UTC m=+167.113371773" Apr 21 07:14:00.550252 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.550212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:00.550252 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.550253 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:00.554849 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.554826 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:00.585052 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.585031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:00.705711 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.705670 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b898f946-rvqzw"] Apr 21 07:14:00.709095 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.709080 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.718478 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.718449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b898f946-rvqzw"] Apr 21 07:14:00.812940 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.812862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-trusted-ca-bundle\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.812940 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.812912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-oauth-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.813104 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.812962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-oauth-config\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.813104 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.812992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-config\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.813104 
ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.813028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnsg\" (UniqueName: \"kubernetes.io/projected/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-kube-api-access-rcnsg\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.813104 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.813058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.813104 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.813074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-service-ca\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914405 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-oauth-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914405 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-oauth-config\") pod \"console-b898f946-rvqzw\" (UID: 
\"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914595 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-config\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914595 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnsg\" (UniqueName: \"kubernetes.io/projected/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-kube-api-access-rcnsg\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914663 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914663 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-service-ca\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.914734 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.914720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-trusted-ca-bundle\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.915145 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.915122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-config\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.915316 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.915291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-oauth-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.915475 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.915458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-trusted-ca-bundle\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.915544 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.915461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-service-ca\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.916996 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.916970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-oauth-config\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.917098 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.917050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-console-serving-cert\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:00.923051 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:00.923028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnsg\" (UniqueName: \"kubernetes.io/projected/6ca6a59b-767f-41c7-b6f7-6808aafeb7ca-kube-api-access-rcnsg\") pod \"console-b898f946-rvqzw\" (UID: \"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca\") " pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:01.019485 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:01.019451 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:01.138475 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:01.138435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b898f946-rvqzw"] Apr 21 07:14:01.141161 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:14:01.141129 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca6a59b_767f_41c7_b6f7_6808aafeb7ca.slice/crio-5f76cf6c01a2c668edd664e38fae52b527310ac9533e8b65b612f52baccf1a1b WatchSource:0}: Error finding container 5f76cf6c01a2c668edd664e38fae52b527310ac9533e8b65b612f52baccf1a1b: Status 404 returned error can't find the container with id 5f76cf6c01a2c668edd664e38fae52b527310ac9533e8b65b612f52baccf1a1b Apr 21 07:14:01.585083 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:01.585049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b898f946-rvqzw" event={"ID":"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca","Type":"ContainerStarted","Data":"27a994fdc199ac48cd7ed9b842363fff3f824276cf93b64a97a69f133495998a"} Apr 21 07:14:01.585553 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:01.585088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b898f946-rvqzw" event={"ID":"6ca6a59b-767f-41c7-b6f7-6808aafeb7ca","Type":"ContainerStarted","Data":"5f76cf6c01a2c668edd664e38fae52b527310ac9533e8b65b612f52baccf1a1b"} Apr 21 07:14:01.605177 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:01.604624 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b898f946-rvqzw" podStartSLOduration=1.604607941 podStartE2EDuration="1.604607941s" podCreationTimestamp="2026-04-21 07:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:14:01.602943739 +0000 UTC m=+177.140830656" 
watchObservedRunningTime="2026-04-21 07:14:01.604607941 +0000 UTC m=+177.142494869" Apr 21 07:14:11.020442 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:11.020413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:11.020442 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:11.020453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:11.025115 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:11.025094 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:11.618163 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:11.618134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b898f946-rvqzw" Apr 21 07:14:11.678450 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:11.678414 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:14:36.697720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.697650 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76dbdc978-9stx4" podUID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" containerName="console" containerID="cri-o://366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815" gracePeriod=15 Apr 21 07:14:36.936290 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.936252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76dbdc978-9stx4_5cda5447-9e81-4344-952b-8e7afaa5ed8c/console/0.log" Apr 21 07:14:36.936404 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.936342 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:36.998450 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998610 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998465 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998610 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998528 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998610 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998610 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998610 ip-10-0-139-104 
kubenswrapper[2576]: I0421 07:14:36.998608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998866 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998626 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h264v\" (UniqueName: \"kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v\") pod \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\" (UID: \"5cda5447-9e81-4344-952b-8e7afaa5ed8c\") " Apr 21 07:14:36.998970 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998945 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:14:36.999024 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca" (OuterVolumeSpecName: "service-ca") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:14:36.999024 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.998990 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config" (OuterVolumeSpecName: "console-config") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:14:36.999371 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:36.999345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:14:37.000561 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.000536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:14:37.000756 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.000736 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v" (OuterVolumeSpecName: "kube-api-access-h264v") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "kube-api-access-h264v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:14:37.000844 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.000817 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5cda5447-9e81-4344-952b-8e7afaa5ed8c" (UID: "5cda5447-9e81-4344-952b-8e7afaa5ed8c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:14:37.099474 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099437 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-trusted-ca-bundle\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099474 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099466 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-config\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099474 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099479 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-service-ca\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099492 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cda5447-9e81-4344-952b-8e7afaa5ed8c-oauth-serving-cert\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099507 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-serving-cert\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099521 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h264v\" (UniqueName: \"kubernetes.io/projected/5cda5447-9e81-4344-952b-8e7afaa5ed8c-kube-api-access-h264v\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.099743 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.099532 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cda5447-9e81-4344-952b-8e7afaa5ed8c-console-oauth-config\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\"" Apr 21 07:14:37.686186 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76dbdc978-9stx4_5cda5447-9e81-4344-952b-8e7afaa5ed8c/console/0.log" Apr 21 07:14:37.686377 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686200 2576 generic.go:358] "Generic (PLEG): container finished" podID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" containerID="366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815" exitCode=2 Apr 21 07:14:37.686377 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686283 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76dbdc978-9stx4" Apr 21 07:14:37.686377 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76dbdc978-9stx4" event={"ID":"5cda5447-9e81-4344-952b-8e7afaa5ed8c","Type":"ContainerDied","Data":"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815"} Apr 21 07:14:37.686476 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76dbdc978-9stx4" event={"ID":"5cda5447-9e81-4344-952b-8e7afaa5ed8c","Type":"ContainerDied","Data":"82aaa9abbb6ff7d63f3f48b01f958a248d53faea5578a5519166a5bf8a6adc73"} Apr 21 07:14:37.686476 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.686404 2576 scope.go:117] "RemoveContainer" containerID="366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815" Apr 21 07:14:37.694664 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.694648 2576 scope.go:117] "RemoveContainer" containerID="366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815" Apr 21 07:14:37.694907 ip-10-0-139-104 kubenswrapper[2576]: E0421 07:14:37.694888 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815\": container with ID starting with 366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815 not found: ID does not exist" containerID="366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815" Apr 21 07:14:37.694979 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.694918 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815"} err="failed to get container status \"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815\": rpc error: code = 
NotFound desc = could not find container \"366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815\": container with ID starting with 366f81ae15129731b6a82d16fa7f862d75e0fe2309603302d59fe4903784b815 not found: ID does not exist" Apr 21 07:14:37.710586 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.710566 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:14:37.716650 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:37.716630 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76dbdc978-9stx4"] Apr 21 07:14:38.990428 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:14:38.990396 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" path="/var/lib/kubelet/pods/5cda5447-9e81-4344-952b-8e7afaa5ed8c/volumes" Apr 21 07:15:04.680476 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.680443 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfnqb/must-gather-7lq4f"] Apr 21 07:15:04.680851 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.680750 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" containerName="console" Apr 21 07:15:04.680851 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.680761 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" containerName="console" Apr 21 07:15:04.680851 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.680818 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cda5447-9e81-4344-952b-8e7afaa5ed8c" containerName="console" Apr 21 07:15:04.683582 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.683564 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:04.686304 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.686281 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfnqb\"/\"openshift-service-ca.crt\"" Apr 21 07:15:04.686392 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.686372 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfnqb\"/\"kube-root-ca.crt\"" Apr 21 07:15:04.693378 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.693357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfnqb/must-gather-7lq4f"] Apr 21 07:15:04.819506 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.819471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:04.819695 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.819515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jsvt\" (UniqueName: \"kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:04.920184 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.920159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 
07:15:04.920302 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.920194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jsvt\" (UniqueName: \"kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:04.920461 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.920442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:04.929225 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.929209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfnqb\"/\"kube-root-ca.crt\"" Apr 21 07:15:04.939583 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.939535 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfnqb\"/\"openshift-service-ca.crt\"" Apr 21 07:15:04.949269 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:04.949239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jsvt\" (UniqueName: \"kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt\") pod \"must-gather-7lq4f\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") " pod="openshift-must-gather-tfnqb/must-gather-7lq4f" Apr 21 07:15:05.001460 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:05.001431 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfnqb/must-gather-7lq4f"
Apr 21 07:15:05.114092 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:05.114066 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfnqb/must-gather-7lq4f"]
Apr 21 07:15:05.116950 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:15:05.116925 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e5595b_efb9_450c_ba32_473e209e9d93.slice/crio-401d76fdbdaafbd35001f56cac45009921d153947fd944c0bca050470325bd2d WatchSource:0}: Error finding container 401d76fdbdaafbd35001f56cac45009921d153947fd944c0bca050470325bd2d: Status 404 returned error can't find the container with id 401d76fdbdaafbd35001f56cac45009921d153947fd944c0bca050470325bd2d
Apr 21 07:15:05.766403 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:05.766364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" event={"ID":"42e5595b-efb9-450c-ba32-473e209e9d93","Type":"ContainerStarted","Data":"401d76fdbdaafbd35001f56cac45009921d153947fd944c0bca050470325bd2d"}
Apr 21 07:15:10.783494 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:10.783455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" event={"ID":"42e5595b-efb9-450c-ba32-473e209e9d93","Type":"ContainerStarted","Data":"d43e39ed126ee340dd9bd20e26b19daf11de7e1a1aae451b4a99c81d9150f8e7"}
Apr 21 07:15:11.787846 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:11.787808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" event={"ID":"42e5595b-efb9-450c-ba32-473e209e9d93","Type":"ContainerStarted","Data":"7f90d2d8fd329577b2291007532b375a892ebd61617baf0354f2806fb907c0ec"}
Apr 21 07:15:11.806095 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:11.806050 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" podStartSLOduration=2.280871692 podStartE2EDuration="7.806036618s" podCreationTimestamp="2026-04-21 07:15:04 +0000 UTC" firstStartedPulling="2026-04-21 07:15:05.118466893 +0000 UTC m=+240.656353794" lastFinishedPulling="2026-04-21 07:15:10.643631818 +0000 UTC m=+246.181518720" observedRunningTime="2026-04-21 07:15:11.80551634 +0000 UTC m=+247.343403264" watchObservedRunningTime="2026-04-21 07:15:11.806036618 +0000 UTC m=+247.343923541"
Apr 21 07:15:18.809646 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:18.809615 2576 generic.go:358] "Generic (PLEG): container finished" podID="42e5595b-efb9-450c-ba32-473e209e9d93" containerID="d43e39ed126ee340dd9bd20e26b19daf11de7e1a1aae451b4a99c81d9150f8e7" exitCode=0
Apr 21 07:15:18.810111 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:18.809685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" event={"ID":"42e5595b-efb9-450c-ba32-473e209e9d93","Type":"ContainerDied","Data":"d43e39ed126ee340dd9bd20e26b19daf11de7e1a1aae451b4a99c81d9150f8e7"}
Apr 21 07:15:18.810111 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:18.809996 2576 scope.go:117] "RemoveContainer" containerID="d43e39ed126ee340dd9bd20e26b19daf11de7e1a1aae451b4a99c81d9150f8e7"
Apr 21 07:15:19.309151 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:19.309121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfnqb_must-gather-7lq4f_42e5595b-efb9-450c-ba32-473e209e9d93/gather/0.log"
Apr 21 07:15:22.783846 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:22.783820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8kj2n_62022203-bfe8-44d8-b46f-c1828de0c5a4/global-pull-secret-syncer/0.log"
Apr 21 07:15:22.896364 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:22.896333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f8dzv_554c2ac8-64c6-4da1-80ab-4059aee84a3e/konnectivity-agent/0.log"
Apr 21 07:15:23.038479 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:23.038399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-104.ec2.internal_3c93887b2602d2e26b42ce7ba4f7f773/haproxy/0.log"
Apr 21 07:15:24.639478 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.639443 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tfnqb/must-gather-7lq4f"]
Apr 21 07:15:24.639924 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.639768 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="copy" containerID="cri-o://7f90d2d8fd329577b2291007532b375a892ebd61617baf0354f2806fb907c0ec" gracePeriod=2
Apr 21 07:15:24.647052 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.647025 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tfnqb/must-gather-7lq4f"]
Apr 21 07:15:24.828278 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.828232 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfnqb_must-gather-7lq4f_42e5595b-efb9-450c-ba32-473e209e9d93/copy/0.log"
Apr 21 07:15:24.828604 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.828584 2576 generic.go:358] "Generic (PLEG): container finished" podID="42e5595b-efb9-450c-ba32-473e209e9d93" containerID="7f90d2d8fd329577b2291007532b375a892ebd61617baf0354f2806fb907c0ec" exitCode=143
Apr 21 07:15:24.863457 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.863429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfnqb_must-gather-7lq4f_42e5595b-efb9-450c-ba32-473e209e9d93/copy/0.log"
Apr 21 07:15:24.863760 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.863744 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfnqb/must-gather-7lq4f"
Apr 21 07:15:24.866284 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.866234 2576 status_manager.go:895] "Failed to get status for pod" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" err="pods \"must-gather-7lq4f\" is forbidden: User \"system:node:ip-10-0-139-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tfnqb\": no relationship found between node 'ip-10-0-139-104.ec2.internal' and this object"
Apr 21 07:15:24.885206 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.885180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jsvt\" (UniqueName: \"kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt\") pod \"42e5595b-efb9-450c-ba32-473e209e9d93\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") "
Apr 21 07:15:24.885305 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.885208 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output\") pod \"42e5595b-efb9-450c-ba32-473e209e9d93\" (UID: \"42e5595b-efb9-450c-ba32-473e209e9d93\") "
Apr 21 07:15:24.885624 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.885600 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "42e5595b-efb9-450c-ba32-473e209e9d93" (UID: "42e5595b-efb9-450c-ba32-473e209e9d93"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:15:24.887231 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.887203 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt" (OuterVolumeSpecName: "kube-api-access-4jsvt") pod "42e5595b-efb9-450c-ba32-473e209e9d93" (UID: "42e5595b-efb9-450c-ba32-473e209e9d93"). InnerVolumeSpecName "kube-api-access-4jsvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:15:24.986225 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.986193 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jsvt\" (UniqueName: \"kubernetes.io/projected/42e5595b-efb9-450c-ba32-473e209e9d93-kube-api-access-4jsvt\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\""
Apr 21 07:15:24.986225 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.986217 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42e5595b-efb9-450c-ba32-473e209e9d93-must-gather-output\") on node \"ip-10-0-139-104.ec2.internal\" DevicePath \"\""
Apr 21 07:15:24.989693 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.989667 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" path="/var/lib/kubelet/pods/42e5595b-efb9-450c-ba32-473e209e9d93/volumes"
Apr 21 07:15:24.990163 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:24.990142 2576 status_manager.go:895] "Failed to get status for pod" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" pod="openshift-must-gather-tfnqb/must-gather-7lq4f" err="pods \"must-gather-7lq4f\" is forbidden: User \"system:node:ip-10-0-139-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tfnqb\": no relationship found between node 'ip-10-0-139-104.ec2.internal' and this object"
Apr 21 07:15:25.832586 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:25.832559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tfnqb_must-gather-7lq4f_42e5595b-efb9-450c-ba32-473e209e9d93/copy/0.log"
Apr 21 07:15:25.832945 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:25.832907 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfnqb/must-gather-7lq4f"
Apr 21 07:15:25.832945 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:25.832935 2576 scope.go:117] "RemoveContainer" containerID="7f90d2d8fd329577b2291007532b375a892ebd61617baf0354f2806fb907c0ec"
Apr 21 07:15:25.841936 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:25.841877 2576 scope.go:117] "RemoveContainer" containerID="d43e39ed126ee340dd9bd20e26b19daf11de7e1a1aae451b4a99c81d9150f8e7"
Apr 21 07:15:26.007877 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.007850 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/alertmanager/0.log"
Apr 21 07:15:26.059857 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.059836 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/config-reloader/0.log"
Apr 21 07:15:26.085635 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.085568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/kube-rbac-proxy-web/0.log"
Apr 21 07:15:26.111357 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.111338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/kube-rbac-proxy/0.log"
Apr 21 07:15:26.136615 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.136598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/kube-rbac-proxy-metric/0.log"
Apr 21 07:15:26.161750 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.161731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/prom-label-proxy/0.log"
Apr 21 07:15:26.189139 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.189122 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1181156a-7ac2-4191-829a-c08fd1305d38/init-config-reloader/0.log"
Apr 21 07:15:26.363876 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.363800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hzvzl_22de7198-baa4-4ff3-a818-f14e9fcbe5e7/monitoring-plugin/0.log"
Apr 21 07:15:26.557566 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.557541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtsqs_3a983a34-5d35-4c65-acc4-d7568c214769/node-exporter/0.log"
Apr 21 07:15:26.580268 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.580239 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtsqs_3a983a34-5d35-4c65-acc4-d7568c214769/kube-rbac-proxy/0.log"
Apr 21 07:15:26.605284 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.605245 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtsqs_3a983a34-5d35-4c65-acc4-d7568c214769/init-textfile/0.log"
Apr 21 07:15:26.635672 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.635588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-98zlw_c1073f55-68d9-442d-9f9f-9e6ee5b25d39/kube-rbac-proxy-main/0.log"
Apr 21 07:15:26.660655 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.660630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-98zlw_c1073f55-68d9-442d-9f9f-9e6ee5b25d39/kube-rbac-proxy-self/0.log"
Apr 21 07:15:26.682556 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.682532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-98zlw_c1073f55-68d9-442d-9f9f-9e6ee5b25d39/openshift-state-metrics/0.log"
Apr 21 07:15:26.909731 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.909645 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t9dkm_bffb6daf-390e-4b76-88a9-1d5c2be7efa8/prometheus-operator/0.log"
Apr 21 07:15:26.931709 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.931663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t9dkm_bffb6daf-390e-4b76-88a9-1d5c2be7efa8/kube-rbac-proxy/0.log"
Apr 21 07:15:26.955231 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:26.955191 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-lh6gd_a7742e71-92d2-4a6f-bf74-3e022bf70298/prometheus-operator-admission-webhook/0.log"
Apr 21 07:15:27.004777 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.004754 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbdfdf76-klktm_2f72e230-0cc4-4b1c-8c70-fb63244807b4/telemeter-client/0.log"
Apr 21 07:15:27.036181 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.036160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbdfdf76-klktm_2f72e230-0cc4-4b1c-8c70-fb63244807b4/reload/0.log"
Apr 21 07:15:27.068928 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.068908 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbdfdf76-klktm_2f72e230-0cc4-4b1c-8c70-fb63244807b4/kube-rbac-proxy/0.log"
Apr 21 07:15:27.108874 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.108851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/thanos-query/0.log"
Apr 21 07:15:27.142819 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.142789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/kube-rbac-proxy-web/0.log"
Apr 21 07:15:27.170082 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.170000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/kube-rbac-proxy/0.log"
Apr 21 07:15:27.202616 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.202565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/prom-label-proxy/0.log"
Apr 21 07:15:27.237890 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.237871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/kube-rbac-proxy-rules/0.log"
Apr 21 07:15:27.267327 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:27.267309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fb796db59-2l7pf_12a53388-3478-4c2a-9bfc-987133408fc6/kube-rbac-proxy-metrics/0.log"
Apr 21 07:15:28.900368 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900338 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"]
Apr 21 07:15:28.900720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900673 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="gather"
Apr 21 07:15:28.900720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900686 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="gather"
Apr 21 07:15:28.900720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900695 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="copy"
Apr 21 07:15:28.900720 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900701 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="copy"
Apr 21 07:15:28.900979 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900747 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="copy"
Apr 21 07:15:28.900979 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.900758 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e5595b-efb9-450c-ba32-473e209e9d93" containerName="gather"
Apr 21 07:15:28.904791 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.904770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:28.907622 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.907600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nhprw\"/\"openshift-service-ca.crt\""
Apr 21 07:15:28.907717 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.907600 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nhprw\"/\"default-dockercfg-p9c5q\""
Apr 21 07:15:28.908475 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.908456 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nhprw\"/\"kube-root-ca.crt\""
Apr 21 07:15:28.914095 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.913540 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"]
Apr 21 07:15:28.991891 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:28.991868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b898f946-rvqzw_6ca6a59b-767f-41c7-b6f7-6808aafeb7ca/console/0.log"
Apr 21 07:15:29.026903 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.026876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-podres\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.027029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.026907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xz42\" (UniqueName: \"kubernetes.io/projected/c5158fab-e3d9-491d-8a96-c008e8e52dcc-kube-api-access-4xz42\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.027029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.026954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-sys\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.027029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.026990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-lib-modules\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.027029 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.027021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-proc\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127328 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-podres\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127430 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xz42\" (UniqueName: \"kubernetes.io/projected/c5158fab-e3d9-491d-8a96-c008e8e52dcc-kube-api-access-4xz42\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127430 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-sys\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127430 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-lib-modules\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127430 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-proc\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-podres\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-sys\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-lib-modules\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.127562 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.127517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c5158fab-e3d9-491d-8a96-c008e8e52dcc-proc\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.137533 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.137510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xz42\" (UniqueName: \"kubernetes.io/projected/c5158fab-e3d9-491d-8a96-c008e8e52dcc-kube-api-access-4xz42\") pod \"perf-node-gather-daemonset-t5hmv\" (UID: \"c5158fab-e3d9-491d-8a96-c008e8e52dcc\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.234683 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.234646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.371068 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.371044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"]
Apr 21 07:15:29.373031 ip-10-0-139-104 kubenswrapper[2576]: W0421 07:15:29.372999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc5158fab_e3d9_491d_8a96_c008e8e52dcc.slice/crio-920315b5afd3bcdef13daaa1aab6d5a8bcb912dda47a687fc0467b551c636e30 WatchSource:0}: Error finding container 920315b5afd3bcdef13daaa1aab6d5a8bcb912dda47a687fc0467b551c636e30: Status 404 returned error can't find the container with id 920315b5afd3bcdef13daaa1aab6d5a8bcb912dda47a687fc0467b551c636e30
Apr 21 07:15:29.845654 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.845616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv" event={"ID":"c5158fab-e3d9-491d-8a96-c008e8e52dcc","Type":"ContainerStarted","Data":"79b5de00849e250a1f90e8e7b7185757dcaf8f08b017b309cb9027c2de1df86f"}
Apr 21 07:15:29.845654 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.845654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv" event={"ID":"c5158fab-e3d9-491d-8a96-c008e8e52dcc","Type":"ContainerStarted","Data":"920315b5afd3bcdef13daaa1aab6d5a8bcb912dda47a687fc0467b551c636e30"}
Apr 21 07:15:29.845858 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.845677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:29.864348 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:29.864244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv" podStartSLOduration=1.864228 podStartE2EDuration="1.864228s" podCreationTimestamp="2026-04-21 07:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:15:29.863365951 +0000 UTC m=+265.401252887" watchObservedRunningTime="2026-04-21 07:15:29.864228 +0000 UTC m=+265.402114924"
Apr 21 07:15:30.207992 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:30.207928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gv622_98939484-0f22-46f7-9460-702c1eb19754/dns/0.log"
Apr 21 07:15:30.233747 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:30.233728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gv622_98939484-0f22-46f7-9460-702c1eb19754/kube-rbac-proxy/0.log"
Apr 21 07:15:30.339781 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:30.339755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nr6p6_c5aed744-8c66-4da8-b412-288b462f285b/dns-node-resolver/0.log"
Apr 21 07:15:30.850294 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:30.850249 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v9k64_4b301c7f-1c6e-4bf8-ba19-c3a7fd175d66/node-ca/0.log"
Apr 21 07:15:31.860154 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:31.860124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6qpz8_1ca15324-b979-4a39-9c0a-defe74d51dd0/serve-healthcheck-canary/0.log"
Apr 21 07:15:32.292071 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:32.292045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g8hn6_0899e970-711f-4417-b5e5-e887c988472c/kube-rbac-proxy/0.log"
Apr 21 07:15:32.315407 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:32.315384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g8hn6_0899e970-711f-4417-b5e5-e887c988472c/exporter/0.log"
Apr 21 07:15:32.338943 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:32.338922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g8hn6_0899e970-711f-4417-b5e5-e887c988472c/extractor/0.log"
Apr 21 07:15:35.858794 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:35.858763 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-t5hmv"
Apr 21 07:15:36.191900 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:36.191831 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5k65r_17117959-58a1-463d-89ab-64b5b61b1443/migrator/0.log"
Apr 21 07:15:36.220357 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:36.220330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5k65r_17117959-58a1-463d-89ab-64b5b61b1443/graceful-termination/0.log"
Apr 21 07:15:37.779937 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.779911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/kube-multus-additional-cni-plugins/0.log"
Apr 21 07:15:37.806249 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.806227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/egress-router-binary-copy/0.log"
Apr 21 07:15:37.830520 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.830496 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/cni-plugins/0.log"
Apr 21 07:15:37.859618 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.859598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/bond-cni-plugin/0.log"
Apr 21 07:15:37.884143 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.884126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/routeoverride-cni/0.log"
Apr 21 07:15:37.912669 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.912629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/whereabouts-cni-bincopy/0.log"
Apr 21 07:15:37.939661 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:37.939639 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vzbdr_e12015e6-2082-4f37-be78-ba178fd7beec/whereabouts-cni/0.log"
Apr 21 07:15:38.055342 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.055322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r8m6x_dec13d58-9abe-4cbd-a479-45ceea3970a9/kube-multus/0.log"
Apr 21 07:15:38.144153 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.144130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fqxn4_61710589-be37-470a-8046-39c730b38313/network-metrics-daemon/0.log"
Apr 21 07:15:38.165981 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.165963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fqxn4_61710589-be37-470a-8046-39c730b38313/kube-rbac-proxy/0.log"
Apr 21 07:15:38.940385 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.940305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-controller/0.log"
Apr 21 07:15:38.961422 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.961396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/0.log"
Apr 21 07:15:38.962581 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.962566 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovn-acl-logging/1.log"
Apr 21 07:15:38.980845 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:38.980828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/kube-rbac-proxy-node/0.log"
Apr 21 07:15:39.002921 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:39.002903 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 07:15:39.030312 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:39.030252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/northd/0.log"
Apr 21 07:15:39.052010 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:39.051993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/nbdb/0.log"
Apr 21 07:15:39.073991 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:39.073972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/sbdb/0.log"
Apr 21 07:15:39.154464 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:39.154438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hmsxs_e19ba7b9-b9ea-4177-a4b5-9fd6f16f010a/ovnkube-controller/0.log"
Apr 21 07:15:40.806837 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:40.806810 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zv2g2_53e6ffe6-1b54-4a2a-8aa1-0a1d310df973/network-check-target-container/0.log"
Apr 21 07:15:41.727208 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:41.727179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lccf5_b7def839-f2b6-4ceb-9338-57bbb74327a3/iptables-alerter/0.log"
Apr 21 07:15:42.353934 ip-10-0-139-104 kubenswrapper[2576]: I0421 07:15:42.353905 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8w45p_13d3ab13-d431-4545-bc03-50c6840b6f39/tuned/0.log"