Apr 16 17:38:41.572094 ip-10-0-140-62 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:38:41.572104 ip-10-0-140-62 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:38:41.572111 ip-10-0-140-62 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:38:41.572333 ip-10-0-140-62 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:38:51.638076 ip-10-0-140-62 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:38:51.638093 ip-10-0-140-62 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3b360dbdb54f4c9280c0c7b1936b63fe --
Apr 16 17:41:24.792819 ip-10-0-140-62 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:41:25.241718 ip-10-0-140-62 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:25.241718 ip-10-0-140-62 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:41:25.241718 ip-10-0-140-62 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:25.241718 ip-10-0-140-62 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:41:25.241718 ip-10-0-140-62 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:25.244768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.244703 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:41:25.246993 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.246980 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:25.246993 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.246993 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.246999 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247002 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247005 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247007 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247010 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247012 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247015 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247018 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247021 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247023 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247030 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247033 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247036 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247039 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247041 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247044 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247047 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247050 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247052 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:25.247053 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247055 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247058 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247061 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247064 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247066 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247069 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247072 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247074 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247077 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247079 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247082 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247084 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247087 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247089 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247092 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247094 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247097 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247099 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247102 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247104 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:25.247584 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247107 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247109 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247111 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247114 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247117 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247120 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247122 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247124 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247127 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247130 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247133 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247135 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247138 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247140 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247144 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247147 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247150 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247152 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247156 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247158 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:25.248066 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247161 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247165 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247169 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247172 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247174 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247177 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247179 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247184 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247188 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247191 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247194 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247196 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247199 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247202 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247204 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247207 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247210 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247212 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247215 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:25.248566 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247217 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247220 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247222 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247225 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247227 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247230 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247613 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247620 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247623 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247626 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247629 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247639 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247642 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247645 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247647 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247650 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247653 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247656 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247658 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:25.249028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247661 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247664 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247666 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247669 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247673 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247675 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247677 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247680 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247683 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247685 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247688 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247692 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247696 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247699 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247701 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247705 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247708 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247711 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247714 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:25.249580 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247717 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247721 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247724 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247727 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247729 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247732 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247736 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247738 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247741 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247744 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247747 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247750 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247753 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247755 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247758 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247760 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247763 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247766 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247768 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247770 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:25.250085 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247773 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247775 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247778 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247781 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247783 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247786 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247788 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247791 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247794 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247796 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247799 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247801 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247804 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247807 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247809 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247811 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247815 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247818 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247820 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247823 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:25.250582 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247825 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247829 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247831 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247834 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247836 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247838 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247841 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247844 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247846 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247849 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247851 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247853 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247856 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.247858 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248585 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248593 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248602 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248607 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248613 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248616 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248620 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:41:25.251061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248624 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248627 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248630 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248634 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248637 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248640 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248643 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248646 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248649 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248652 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248655 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248658 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248674 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248676 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248680 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248682 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248686 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248693 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248696 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248699 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248702 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248705 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248708 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248711 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248714 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:41:25.251589 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248717 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248721 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248724 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248727 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248730 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248733 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248736 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248743 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248746 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248749 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248752 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248755 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248760 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248763 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248766 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248769 2570 flags.go:64] FLAG: --eviction-soft="" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248772 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248775 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248778 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248786 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248789 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248792 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248795 2570 flags.go:64] FLAG: --feature-gates="" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248799 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248802 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 17:41:25.252196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248805 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248808 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248811 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248814 2570 
flags.go:64] FLAG: --help="false" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248816 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248819 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248822 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248825 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248828 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248831 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248834 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248838 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248841 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248844 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248847 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248850 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248853 2570 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:41:25.252817 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:41:25.248856 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248859 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248862 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248865 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248868 2570 flags.go:64] FLAG: --lock-file="" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248871 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248873 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:41:25.252817 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248876 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248881 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248884 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248893 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248896 2570 flags.go:64] FLAG: --logging-format="text" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248899 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248902 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248905 2570 flags.go:64] FLAG: 
--manifest-url="" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248908 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248912 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248915 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248919 2570 flags.go:64] FLAG: --max-pods="110" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248922 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248925 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248928 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248930 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248933 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248936 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248939 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248947 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248950 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248953 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:41:25.253405 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:41:25.248956 2570 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:41:25.253405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248959 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248965 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248968 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248972 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248974 2570 flags.go:64] FLAG: --port="10250" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248977 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248980 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08d3248efc8ad439c" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248983 2570 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248986 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248989 2570 flags.go:64] FLAG: --register-node="true" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248992 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248995 2570 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.248998 2570 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249007 2570 
flags.go:64] FLAG: --registry-qps="5" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249010 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249012 2570 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249016 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249019 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249022 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249037 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249041 2570 flags.go:64] FLAG: --runonce="false" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249045 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249048 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249051 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249054 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249057 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:41:25.253972 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249060 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249063 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:41:25.254606 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:41:25.249066 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249070 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249073 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249076 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249078 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249082 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249085 2570 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249088 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249094 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249096 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249099 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249106 2570 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249109 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249111 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249114 2570 flags.go:64] FLAG: 
--topology-manager-policy-options="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249117 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249120 2570 flags.go:64] FLAG: --v="2" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249124 2570 flags.go:64] FLAG: --version="false" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249133 2570 flags.go:64] FLAG: --vmodule="" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249137 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.249141 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249248 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249252 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:41:25.254606 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249256 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249259 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249262 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249264 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249267 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 
17:41:25.249276 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249280 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249283 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249288 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249292 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249297 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249300 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249303 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249305 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249309 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249312 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249315 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249317 2570 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249320 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:41:25.255204 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249322 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249325 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249328 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249330 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249333 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249335 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249338 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249340 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249343 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249346 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249349 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249351 2570 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249354 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249356 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249359 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249361 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249363 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249366 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249368 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249371 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:41:25.255736 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249373 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249376 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249378 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249383 2570 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249386 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249388 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249391 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249395 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249398 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249400 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249403 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249406 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249408 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249411 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249413 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249416 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 
17:41:25.249418 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249421 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249424 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249426 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:41:25.256470 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249429 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249431 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249434 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249437 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249439 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249442 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249444 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249447 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249450 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:41:25.257076 
ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249452 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249455 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249458 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249460 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249463 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249465 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249469 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249472 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249474 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249477 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249484 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:41:25.257076 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249487 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249489 2570 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249492 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249494 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.249497 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.250215 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.257238 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.257256 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257342 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257350 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257355 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257360 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257365 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257369 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257375 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:25.257930 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257379 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257383 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257387 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257392 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257396 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257400 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257404 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257409 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257413 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257417 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257422 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257426 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257430 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257434 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257438 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257442 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257446 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257450 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257454 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:25.258551 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257459 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257463 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257467 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257478 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257485 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257491 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257496 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257501 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257523 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257528 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257532 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257537 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257541 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257545 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257549 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257554 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257558 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257563 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257566 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:25.259024 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257571 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257575 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257579 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257583 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257587 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257592 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257596 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257600 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257604 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257608 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257613 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257617 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257622 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257626 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257630 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257636 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257642 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257646 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257651 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:25.259578 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257655 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257660 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257664 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257668 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257673 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257677 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257680 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257684 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257689 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257693 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257697 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257701 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257707 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257711 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257716 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257720 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257724 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257728 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257732 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257737 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:25.260319 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257741 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257745 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.257753 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257903 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257911 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257915 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257920 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257924 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257928 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257933 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257937 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257942 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257946 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257950 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257954 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257959 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:25.261181 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257963 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257967 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257971 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257975 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257980 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257984 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257988 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257992 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.257997 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258006 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258012 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258017 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258022 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258026 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258030 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258034 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258038 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258042 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258046 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:25.261671 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258051 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258055 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258059 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258063 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258067 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258071 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258075 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258079 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258084 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258088 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258093 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258097 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258101 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258105 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258109 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258113 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258117 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258121 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258125 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258129 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:25.262255 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258133 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258137 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258141 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258146 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258150 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258155 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258159 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258163 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258167 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258171 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258175 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258179 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258183 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258187 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258191 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258195 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258199 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258204 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258208 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258212 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:25.262776 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258216 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258220 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258224 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258229 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258233 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258237 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258240 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258245 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258249 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258252 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258256 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258260 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258264 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:25.258268 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.258276 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:25.263291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.259054 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 17:41:25.263699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.261837 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 17:41:25.263699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.262773 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 17:41:25.263699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.262866 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:41:25.263699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.262906 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:41:25.286665 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.286642 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:41:25.290705 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.290686 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:41:25.308176 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.308160 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:41:25.313721 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.313708 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 17:41:25.314976 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.314962 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:41:25.316767 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.316752 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:41:25.317189 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.317174 2570 fs.go:135] Filesystem UUIDs: map[0da27801-105e-4c59-a6da-ce51fc3f2019:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7ed3ec3b-9c1f-4794-b6ef-1845d8034ee6:/dev/nvme0n1p3]
Apr 16 17:41:25.317228 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.317190 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:41:25.322848 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.322737 2570 manager.go:217] Machine: {Timestamp:2026-04-16 17:41:25.321653723 +0000 UTC m=+0.401874862 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105480 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e4c568eccf0f9a68e9bef3557d865 SystemUUID:ec2e4c56-8ecc-f0f9-a68e-9bef3557d865 BootID:3b360dbd-b54f-4c92-80c0-c7b1936b63fe Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:63:ee:88:c8:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:63:ee:88:c8:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:77:e4:5e:68:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:41:25.322848 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.322835 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:41:25.322977 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.322899 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 17:41:25.323737 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323717 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 17:41:25.323851 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323738 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-62.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 17:41:25.323896 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323859 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 17:41:25.323896 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323868 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 17:41:25.323896 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323880 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:41:25.323896 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.323889 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:41:25.325620 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.325609 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:41:25.325716 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.325707 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 17:41:25.328032 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328023 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 17:41:25.328072 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328035 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 17:41:25.328072 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328046 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 17:41:25.328072 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328055 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 16 17:41:25.328072 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328062 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 17:41:25.329006 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.328995 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:41:25.329056 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.329012 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:41:25.331563 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.331549 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 17:41:25.333074 ip-10-0-140-62
kubenswrapper[2570]: I0416 17:41:25.333061 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 17:41:25.334373 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334360 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334378 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334386 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334395 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334403 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334409 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334414 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334419 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334426 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 17:41:25.334428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334433 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 17:41:25.334700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334447 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 17:41:25.334700 
ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.334455 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 17:41:25.335170 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.335160 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 17:41:25.335170 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.335170 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 17:41:25.338003 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.337972 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-62.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 17:41:25.338082 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.338017 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 17:41:25.338082 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.338024 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-62.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 17:41:25.338628 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.338615 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 17:41:25.338705 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.338655 2570 server.go:1295] "Started kubelet" Apr 16 17:41:25.338758 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.338715 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 
17:41:25.338852 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.338806 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 17:41:25.338892 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.338873 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 17:41:25.339347 ip-10-0-140-62 systemd[1]: Started Kubernetes Kubelet. Apr 16 17:41:25.339864 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.339844 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 17:41:25.341279 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.341256 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 16 17:41:25.341917 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.341900 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79q5q" Apr 16 17:41:25.346021 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.345992 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 17:41:25.346476 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.346458 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 17:41:25.346735 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.345804 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-62.ec2.internal.18a6e72d6d5d7f19 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-62.ec2.internal,UID:ip-10-0-140-62.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-62.ec2.internal,},FirstTimestamp:2026-04-16 17:41:25.338627865 +0000 UTC m=+0.418848988,LastTimestamp:2026-04-16 17:41:25.338627865 +0000 UTC m=+0.418848988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-62.ec2.internal,}" Apr 16 17:41:25.346986 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.346962 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 17:41:25.347332 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347318 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 17:41:25.347414 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347320 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 17:41:25.347414 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347348 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 17:41:25.347488 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347445 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 16 17:41:25.347488 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347452 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 16 17:41:25.347572 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.347542 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.347778 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347766 2570 factory.go:55] Registering systemd factory Apr 16 17:41:25.347808 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.347783 2570 factory.go:223] Registration of the systemd container factory successfully Apr 16 17:41:25.348599 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.348576 2570 factory.go:153] 
Registering CRI-O factory Apr 16 17:41:25.348599 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.348593 2570 factory.go:223] Registration of the crio container factory successfully Apr 16 17:41:25.348728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.348639 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 17:41:25.348728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.348653 2570 factory.go:103] Registering Raw factory Apr 16 17:41:25.348728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.348666 2570 manager.go:1196] Started watching for new ooms in manager Apr 16 17:41:25.349123 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.349112 2570 manager.go:319] Starting recovery of all containers Apr 16 17:41:25.351181 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.351157 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79q5q" Apr 16 17:41:25.353013 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.352983 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-62.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 17:41:25.353100 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.353078 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 17:41:25.361577 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.361423 2570 
manager.go:324] Recovery completed Apr 16 17:41:25.365625 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.365612 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:41:25.367854 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.367827 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.367913 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.367863 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.367913 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.367876 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.368352 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.368336 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 17:41:25.368352 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.368351 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 17:41:25.368427 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.368365 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:41:25.369265 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.369203 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-62.ec2.internal.18a6e72d6f1b524a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-62.ec2.internal,UID:ip-10-0-140-62.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-62.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-62.ec2.internal,},FirstTimestamp:2026-04-16 17:41:25.36784545 +0000 UTC m=+0.448066577,LastTimestamp:2026-04-16 17:41:25.36784545 +0000 UTC m=+0.448066577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-62.ec2.internal,}" Apr 16 17:41:25.370692 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.370680 2570 policy_none.go:49] "None policy: Start" Apr 16 17:41:25.370730 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.370696 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 17:41:25.370730 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.370706 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 16 17:41:25.418092 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418079 2570 manager.go:341] "Starting Device Plugin manager" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.418115 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418128 2570 server.go:85] "Starting device plugin registration server" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418339 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418350 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418422 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418522 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" 
Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.418535 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.418963 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 17:41:25.429292 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.419000 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.519218 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.519175 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:41:25.520857 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.520841 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.520924 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.520868 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.520924 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.520880 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.520924 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.520913 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.522415 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.522385 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 17:41:25.523568 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.523546 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 17:41:25.523568 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.523570 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 17:41:25.523678 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.523585 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 17:41:25.523678 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.523593 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 17:41:25.523678 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.523621 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 17:41:25.526872 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.526856 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:41:25.537546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.537533 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.537626 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.537551 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-62.ec2.internal\": node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.593393 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.593374 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.624489 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.624469 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal"] Apr 16 17:41:25.624580 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.624553 2570 kubelet_node_status.go:413] "Setting node annotation to enable 
volume controller attach/detach" Apr 16 17:41:25.625815 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.625800 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.625906 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.625819 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.625906 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.625829 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.627158 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627146 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:41:25.627284 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627258 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.627323 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627290 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:41:25.627830 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627815 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.627830 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627825 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.627931 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627847 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.627931 ip-10-0-140-62 kubenswrapper[2570]: I0416 
17:41:25.627860 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.627931 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627848 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.627931 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.627933 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.629486 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.629470 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.629576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.629499 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:41:25.630154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.630142 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:41:25.630241 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.630163 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:41:25.630241 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.630175 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:41:25.648048 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.648030 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-62.ec2.internal\" not found" node="ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.648735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.648719 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.648811 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.648756 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.648811 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.648779 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/045b24ad5fe56c3ffbf4f39e4a48e404-config\") pod \"kube-apiserver-proxy-ip-10-0-140-62.ec2.internal\" (UID: \"045b24ad5fe56c3ffbf4f39e4a48e404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.652390 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.652377 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-62.ec2.internal\" not found" node="ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.694398 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.694384 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.749167 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749140 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/045b24ad5fe56c3ffbf4f39e4a48e404-config\") pod \"kube-apiserver-proxy-ip-10-0-140-62.ec2.internal\" (UID: \"045b24ad5fe56c3ffbf4f39e4a48e404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.749263 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.749263 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.749263 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.749263 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/045b24ad5fe56c3ffbf4f39e4a48e404-config\") pod \"kube-apiserver-proxy-ip-10-0-140-62.ec2.internal\" (UID: \"045b24ad5fe56c3ffbf4f39e4a48e404\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.749263 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.749242 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f2638ab93f1463c33d45992d535fdc15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal\" (UID: \"f2638ab93f1463c33d45992d535fdc15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.795425 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.795388 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.895973 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.895950 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found" Apr 16 17:41:25.951399 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.951385 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" Apr 16 17:41:25.956151 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:25.956134 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal"
Apr 16 17:41:25.996902 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:25.996882 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.097408 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.097351 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.181245 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.181225 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:26.198261 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.198244 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.262791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.262771 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:41:26.263124 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.262869 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:41:26.263124 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.262922 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:41:26.299081 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.299058 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.346197 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.346181 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:41:26.357099 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.357030 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:36:25 +0000 UTC" deadline="2028-01-14 16:37:34.895388864 +0000 UTC"
Apr 16 17:41:26.357099 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.357072 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15310h56m8.538319609s"
Apr 16 17:41:26.359786 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.359768 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:41:26.370409 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:26.370384 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2638ab93f1463c33d45992d535fdc15.slice/crio-6b16ef2adee19f2e76b6f174b9c92382450d856e04584236eb6077b61a8363a8 WatchSource:0}: Error finding container 6b16ef2adee19f2e76b6f174b9c92382450d856e04584236eb6077b61a8363a8: Status 404 returned error can't find the container with id 6b16ef2adee19f2e76b6f174b9c92382450d856e04584236eb6077b61a8363a8
Apr 16 17:41:26.375619 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.375606 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:41:26.380691 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:26.380673 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045b24ad5fe56c3ffbf4f39e4a48e404.slice/crio-aed74e67c80cb83fa3ebf2be880d05e1197983b88126dca6dd298991be8d4d3f WatchSource:0}: Error finding container aed74e67c80cb83fa3ebf2be880d05e1197983b88126dca6dd298991be8d4d3f: Status 404 returned error can't find the container with id aed74e67c80cb83fa3ebf2be880d05e1197983b88126dca6dd298991be8d4d3f
Apr 16 17:41:26.389551 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.389532 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bxswv"
Apr 16 17:41:26.396862 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.396845 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bxswv"
Apr 16 17:41:26.399340 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.399324 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.408492 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.408472 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:26.500294 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.500270 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.525990 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.525947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" event={"ID":"045b24ad5fe56c3ffbf4f39e4a48e404","Type":"ContainerStarted","Data":"aed74e67c80cb83fa3ebf2be880d05e1197983b88126dca6dd298991be8d4d3f"}
Apr 16 17:41:26.526836 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.526815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" event={"ID":"f2638ab93f1463c33d45992d535fdc15","Type":"ContainerStarted","Data":"6b16ef2adee19f2e76b6f174b9c92382450d856e04584236eb6077b61a8363a8"}
Apr 16 17:41:26.601360 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:26.601335 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-62.ec2.internal\" not found"
Apr 16 17:41:26.630317 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.630279 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:26.647778 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.647757 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal"
Apr 16 17:41:26.659717 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.659698 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:41:26.661898 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.661887 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal"
Apr 16 17:41:26.671017 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:26.671004 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:41:27.330261 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.330233 2570 apiserver.go:52] "Watching apiserver"
Apr 16 17:41:27.341717 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.341681 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 17:41:27.342968 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.342941 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d9mxm","openshift-cluster-node-tuning-operator/tuned-gjbj8","openshift-dns/node-resolver-j272p","openshift-image-registry/node-ca-x6c67","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal","openshift-multus/multus-additional-cni-plugins-jdzwz","openshift-multus/multus-n7mp6","openshift-multus/network-metrics-daemon-l7h7z","openshift-network-diagnostics/network-check-target-5rkjf","kube-system/konnectivity-agent-tl2gd","kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9","openshift-network-operator/iptables-alerter-7lffp"]
Apr 16 17:41:27.344391 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.344364 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.345353 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.345334 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.346598 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.346579 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.347111 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347077 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.347355 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347338 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 17:41:27.347715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347477 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mjst5\""
Apr 16 17:41:27.347715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347581 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.347715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347620 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 17:41:27.347910 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.347875 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x6c67"
Apr 16 17:41:27.348390 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.348368 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.348463 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.348395 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lhz44\""
Apr 16 17:41:27.348539 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.348491 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.349041 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.349004 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qmr25\""
Apr 16 17:41:27.349041 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.349024 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.349443 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.349425 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.349930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.349913 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.350199 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.350176 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 17:41:27.350283 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.350213 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.350343 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.350310 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q52x8\""
Apr 16 17:41:27.350395 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.350358 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.350753 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.350733 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.351760 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.351706 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dwljk\""
Apr 16 17:41:27.351848 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.351824 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 17:41:27.352496 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.351958 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 17:41:27.352496 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.352153 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:27.352496 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.352214 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:27.353700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.353681 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 17:41:27.353788 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.353733 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.353788 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.353761 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:27.353943 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.353816 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:27.355499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.354634 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.355499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.354652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 17:41:27.355499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.354657 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kkmhp\""
Apr 16 17:41:27.355499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.354669 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 17:41:27.355499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.354699 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 17:41:27.357169 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-system-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357252 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-bin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357252 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-kubernetes\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357358 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3965d-906a-4df5-bec1-9edc3c2a8a64-tmp-dir\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.357358 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rql6q\" (UniqueName: \"kubernetes.io/projected/aae5a78d-4165-4422-a612-17627616235f-kube-api-access-rql6q\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67"
Apr 16 17:41:27.357358 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357488 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-socket-dir-parent\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357488 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-modprobe-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357488 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-conf\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357640 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26d9\" (UniqueName: \"kubernetes.io/projected/fb67ec76-fe28-444f-b4f0-51430f30c713-kube-api-access-k26d9\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357640 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-run\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357640 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-k8s-cni-cncf-io\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357773 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-kubelet\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357773 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-hostroot\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.357773 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysconfig\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357906 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357793 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-sys\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357906 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-etc-tuned\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.357988 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.357920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-conf-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358032 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.358146 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-os-release\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358200 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358161 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-cni-binary-copy\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358249 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-multus\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358304 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-var-lib-kubelet\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.358353 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-tmp\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.358715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffxg\" (UniqueName: \"kubernetes.io/projected/25d3965d-906a-4df5-bec1-9edc3c2a8a64-kube-api-access-tffxg\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.358798 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aae5a78d-4165-4422-a612-17627616235f-serviceca\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67"
Apr 16 17:41:27.358798 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-cnibin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358798 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-netns\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-etc-kubernetes\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358824 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-systemd\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358846 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-lib-modules\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25d3965d-906a-4df5-bec1-9edc3c2a8a64-hosts-file\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358871 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tl2gd"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae5a78d-4165-4422-a612-17627616235f-host\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67"
Apr 16 17:41:27.358946 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-daemon-config\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.359290 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-multus-certs\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.359290 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358971 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-host\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.359290 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.358995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvll\" (UniqueName: \"kubernetes.io/projected/9a25f350-b652-47af-8404-87e373883218-kube-api-access-vfvll\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.361135 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.361115 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 17:41:27.361298 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.361280 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-c4mbn\""
Apr 16 17:41:27.361388 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.361372 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 17:41:27.361451 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.361433 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.363700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.363683 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.363953 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.363931 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 17:41:27.364153 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.364121 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4m88m\""
Apr 16 17:41:27.364234 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.364218 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.365181 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.365164 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.366276 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.366218 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:41:27.366377 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.366293 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 17:41:27.366377 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.366343 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-q5gxj\""
Apr 16 17:41:27.366485 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.366386 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 17:41:27.397403 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.397379 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:36:26 +0000 UTC" deadline="2027-11-06 05:01:27.812766667 +0000 UTC"
Apr 16 17:41:27.397403 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.397403 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13643h20m0.415367277s"
Apr 16 17:41:27.448653 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.448633 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 17:41:27.459639 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-conf-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.459729 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459648 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-slash\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.459729 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgf65\" (UniqueName: \"kubernetes.io/projected/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-kube-api-access-wgf65\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:27.459729 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-agent-certs\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd"
Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-cni-binary-copy\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-conf-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-var-lib-kubelet\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-tmp\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-var-lib-kubelet\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tffxg\" (UniqueName: \"kubernetes.io/projected/25d3965d-906a-4df5-bec1-9edc3c2a8a64-kube-api-access-tffxg\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p" Apr 16 17:41:27.459856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aae5a78d-4165-4422-a612-17627616235f-serviceca\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovn-node-metrics-cert\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459884 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459911 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-cnibin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-netns\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.459984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-netns\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460113 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-cnibin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-systemd\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460154 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460182 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-lib-modules\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25d3965d-906a-4df5-bec1-9edc3c2a8a64-hosts-file\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae5a78d-4165-4422-a612-17627616235f-host\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.460308 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-systemd\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aae5a78d-4165-4422-a612-17627616235f-serviceca\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " 
pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-log-socket\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25d3965d-906a-4df5-bec1-9edc3c2a8a64-hosts-file\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-cni-binary-copy\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aae5a78d-4165-4422-a612-17627616235f-host\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " 
pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-daemon-config\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.460481 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-host\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460866 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460499 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-host\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460866 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-lib-modules\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.460866 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-etc-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.460866 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.461068 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.460632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.461123 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr25p\" (UniqueName: \"kubernetes.io/projected/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-kube-api-access-mr25p\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.461178 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-bin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.461234 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-kubernetes\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.461234 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rql6q\" (UniqueName: \"kubernetes.io/projected/aae5a78d-4165-4422-a612-17627616235f-kube-api-access-rql6q\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.461234 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-daemon-config\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.461388 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-env-overrides\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.461388 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-bin\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.461388 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-kubernetes\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.461388 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-iptables-alerter-script\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp" Apr 16 17:41:27.461592 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461398 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-system-cni-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.461592 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:27.461592 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-socket-dir-parent\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 
16 17:41:27.461745 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-var-lib-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.461800 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-kubelet\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.461800 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-bin\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.461902 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg955\" (UniqueName: \"kubernetes.io/projected/4e7d951d-c7c5-444a-93c5-26faa5766be6-kube-api-access-lg955\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.461902 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461862 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtjg\" (UniqueName: 
\"kubernetes.io/projected/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-kube-api-access-6gtjg\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp" Apr 16 17:41:27.462008 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-konnectivity-ca\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:27.462008 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-k8s-cni-cncf-io\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462008 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.461961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-hostroot\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462159 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysconfig\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.462159 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-etc-tuned\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.462159 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-ovn\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.462159 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.462354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-systemd-units\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.462354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-node-log\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.462354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-socket-dir-parent\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462580 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-socket-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.462638 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.462638 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-k8s-cni-cncf-io\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-os-release\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-hostroot\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-multus\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.462735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysconfig\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.462928 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-systemd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.462928 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462814 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-etc-selinux\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.462928 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462869 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-etc-kubernetes\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.462928 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-netns\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.463119 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.462932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:27.463119 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-registration-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.463119 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-host-slash\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.463271 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-multus-certs\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.463271 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvll\" (UniqueName: \"kubernetes.io/projected/9a25f350-b652-47af-8404-87e373883218-kube-api-access-vfvll\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.463271 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-script-lib\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.463422 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463281 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-device-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.463422 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cnibin\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.463422 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-system-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.463591 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3965d-906a-4df5-bec1-9edc3c2a8a64-tmp-dir\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.463591 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463477 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-sys-fs\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.463684 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-tmp\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.463740 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.463740 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-modprobe-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.463867 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-cni-multus\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.463867 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-conf\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.463867 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-os-release\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.463867 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.464059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-etc-kubernetes\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463896 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.464059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463929 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k26d9\" (UniqueName: \"kubernetes.io/projected/fb67ec76-fe28-444f-b4f0-51430f30c713-kube-api-access-k26d9\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.463984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-run-multus-certs\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-sysctl-conf\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-etc-modprobe-d\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-system-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-run\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-netd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464242 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkd8\" (UniqueName: \"kubernetes.io/projected/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-kube-api-access-cpkd8\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.464301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-os-release\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464337 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464367 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-kubelet\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-sys\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-config\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.464603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-run\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464884 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-host-var-lib-kubelet\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.464884 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a25f350-b652-47af-8404-87e373883218-sys\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.464994 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.464925 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3965d-906a-4df5-bec1-9edc3c2a8a64-tmp-dir\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.465369 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.465089 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb67ec76-fe28-444f-b4f0-51430f30c713-multus-cni-dir\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.467005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.466942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a25f350-b652-47af-8404-87e373883218-etc-tuned\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.468106 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.468072 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:27.470459 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.470375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rql6q\" (UniqueName: \"kubernetes.io/projected/aae5a78d-4165-4422-a612-17627616235f-kube-api-access-rql6q\") pod \"node-ca-x6c67\" (UID: \"aae5a78d-4165-4422-a612-17627616235f\") " pod="openshift-image-registry/node-ca-x6c67"
Apr 16 17:41:27.470459 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.470383 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffxg\" (UniqueName: \"kubernetes.io/projected/25d3965d-906a-4df5-bec1-9edc3c2a8a64-kube-api-access-tffxg\") pod \"node-resolver-j272p\" (UID: \"25d3965d-906a-4df5-bec1-9edc3c2a8a64\") " pod="openshift-dns/node-resolver-j272p"
Apr 16 17:41:27.472231 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.472142 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvll\" (UniqueName: \"kubernetes.io/projected/9a25f350-b652-47af-8404-87e373883218-kube-api-access-vfvll\") pod \"tuned-gjbj8\" (UID: \"9a25f350-b652-47af-8404-87e373883218\") " pod="openshift-cluster-node-tuning-operator/tuned-gjbj8"
Apr 16 17:41:27.472592 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.472552 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26d9\" (UniqueName: \"kubernetes.io/projected/fb67ec76-fe28-444f-b4f0-51430f30c713-kube-api-access-k26d9\") pod \"multus-n7mp6\" (UID: \"fb67ec76-fe28-444f-b4f0-51430f30c713\") " pod="openshift-multus/multus-n7mp6"
Apr 16 17:41:27.565411 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-slash\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgf65\" (UniqueName: \"kubernetes.io/projected/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-kube-api-access-wgf65\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-agent-certs\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovn-node-metrics-cert\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565515 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-slash\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-log-socket\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.565576 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-etc-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr25p\" (UniqueName: \"kubernetes.io/projected/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-kube-api-access-mr25p\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-env-overrides\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-iptables-alerter-script\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565703 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-system-cni-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-etc-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-var-lib-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-kubelet\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565779 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-log-socket\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-bin\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-bin\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565834 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-system-cni-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-kubelet\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg955\" (UniqueName: \"kubernetes.io/projected/4e7d951d-c7c5-444a-93c5-26faa5766be6-kube-api-access-lg955\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-var-lib-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565932 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtjg\" (UniqueName: \"kubernetes.io/projected/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-kube-api-access-6gtjg\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-konnectivity-ca\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-ovn\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-systemd-units\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-node-log\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-socket-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-systemd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-etc-selinux\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-netns\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-ovn\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-registration-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-host-slash\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-script-lib\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-socket-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.566818 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-iptables-alerter-script\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.565959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566378 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-systemd-units\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-systemd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-device-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-node-log\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566310 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-device-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cnibin\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-etc-selinux\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566580 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cnibin\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-registration-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-sys-fs\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566536 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-env-overrides\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-sys-fs\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9"
Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566640 2570
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-run-netns\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.566715 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:27.567562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566731 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-netd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkd8\" (UniqueName: 
\"kubernetes.io/projected/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-kube-api-access-cpkd8\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.566773 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:28.066752183 +0000 UTC m=+3.146973313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-os-release\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566852 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566881 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-config\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-run-openvswitch\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566964 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-konnectivity-ca\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.566996 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e7d951d-c7c5-444a-93c5-26faa5766be6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567009 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-script-lib\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-host-slash\") pod \"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567165 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-os-release\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-host-cni-netd\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567401 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovnkube-config\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.568207 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568792 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567523 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568792 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.567526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.568792 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.568103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-ovn-node-metrics-cert\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 
17:41:27.568792 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.568215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e15b302-e2d8-4a43-85d6-2c1a3bb9b319-agent-certs\") pod \"konnectivity-agent-tl2gd\" (UID: \"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319\") " pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:27.571490 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.571472 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:27.571490 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.571492 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:27.571638 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.571520 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:27.571638 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:27.571605 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:28.071572881 +0000 UTC m=+3.151793995 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:27.574154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.574133 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg955\" (UniqueName: \"kubernetes.io/projected/4e7d951d-c7c5-444a-93c5-26faa5766be6-kube-api-access-lg955\") pod \"aws-ebs-csi-driver-node-sm5r9\" (UID: \"4e7d951d-c7c5-444a-93c5-26faa5766be6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.574256 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.574179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgf65\" (UniqueName: \"kubernetes.io/projected/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-kube-api-access-wgf65\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:27.575131 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.575105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr25p\" (UniqueName: \"kubernetes.io/projected/29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8-kube-api-access-mr25p\") pod \"multus-additional-cni-plugins-jdzwz\" (UID: \"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8\") " pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.575700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.575681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtjg\" (UniqueName: \"kubernetes.io/projected/0321b210-b7f5-4cfe-9d6d-318f4ba3d299-kube-api-access-6gtjg\") pod 
\"iptables-alerter-7lffp\" (UID: \"0321b210-b7f5-4cfe-9d6d-318f4ba3d299\") " pod="openshift-network-operator/iptables-alerter-7lffp" Apr 16 17:41:27.575798 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.575747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkd8\" (UniqueName: \"kubernetes.io/projected/6f384fae-bee8-46b2-8fd3-71f7ece4b87e-kube-api-access-cpkd8\") pod \"ovnkube-node-d9mxm\" (UID: \"6f384fae-bee8-46b2-8fd3-71f7ece4b87e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.655958 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.655904 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n7mp6" Apr 16 17:41:27.663577 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.663553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" Apr 16 17:41:27.671304 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.671285 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j272p" Apr 16 17:41:27.675750 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.675731 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x6c67" Apr 16 17:41:27.682728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.682706 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" Apr 16 17:41:27.688361 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.688343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:27.694872 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.694856 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:27.701450 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.701432 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" Apr 16 17:41:27.706961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:27.706942 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7lffp" Apr 16 17:41:27.981557 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:27.981403 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d3965d_906a_4df5_bec1_9edc3c2a8a64.slice/crio-dae4c06640a0723bdd0ecf48a3d96ef8bdb18935be6a431d1e05330039d77b7a WatchSource:0}: Error finding container dae4c06640a0723bdd0ecf48a3d96ef8bdb18935be6a431d1e05330039d77b7a: Status 404 returned error can't find the container with id dae4c06640a0723bdd0ecf48a3d96ef8bdb18935be6a431d1e05330039d77b7a Apr 16 17:41:27.983046 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:27.983020 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0321b210_b7f5_4cfe_9d6d_318f4ba3d299.slice/crio-03e417ed1e2bd8bf8ebc80dd35d42103fc13e0f555222e2cb543949620d05558 WatchSource:0}: Error finding container 03e417ed1e2bd8bf8ebc80dd35d42103fc13e0f555222e2cb543949620d05558: Status 404 returned error can't find the container with id 03e417ed1e2bd8bf8ebc80dd35d42103fc13e0f555222e2cb543949620d05558 Apr 16 17:41:27.985937 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:27.985909 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7d951d_c7c5_444a_93c5_26faa5766be6.slice/crio-2f455a08ddafdcab86fd44ecedb642e9ae6d0dad0ce01efe22225308c241d25d WatchSource:0}: Error finding container 
2f455a08ddafdcab86fd44ecedb642e9ae6d0dad0ce01efe22225308c241d25d: Status 404 returned error can't find the container with id 2f455a08ddafdcab86fd44ecedb642e9ae6d0dad0ce01efe22225308c241d25d Apr 16 17:41:27.986574 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:27.986546 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e15b302_e2d8_4a43_85d6_2c1a3bb9b319.slice/crio-d62948d20b4d9288cf1642a17bf051e9e9c709aecf234602d5cfd6fc5c06c974 WatchSource:0}: Error finding container d62948d20b4d9288cf1642a17bf051e9e9c709aecf234602d5cfd6fc5c06c974: Status 404 returned error can't find the container with id d62948d20b4d9288cf1642a17bf051e9e9c709aecf234602d5cfd6fc5c06c974 Apr 16 17:41:27.988031 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:27.987951 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae5a78d_4165_4422_a612_17627616235f.slice/crio-7c78d990b54311932cfced7d93ae1a6806aa9a1a4f1f35d8c2dd5e279efa32ff WatchSource:0}: Error finding container 7c78d990b54311932cfced7d93ae1a6806aa9a1a4f1f35d8c2dd5e279efa32ff: Status 404 returned error can't find the container with id 7c78d990b54311932cfced7d93ae1a6806aa9a1a4f1f35d8c2dd5e279efa32ff Apr 16 17:41:28.069617 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.069527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:28.069728 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.069668 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:28.069790 ip-10-0-140-62 kubenswrapper[2570]: 
E0416 17:41:28.069731 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:29.069712393 +0000 UTC m=+4.149933520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:28.163545 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.163522 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:41:28.169900 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.169877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:28.169999 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.169988 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:28.170052 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.170002 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:28.170052 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.170011 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod 
openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:28.170052 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.170051 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:29.170036714 +0000 UTC m=+4.250257821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:28.398342 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.398203 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:36:26 +0000 UTC" deadline="2027-12-06 08:32:17.986998845 +0000 UTC" Apr 16 17:41:28.398342 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.398240 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14366h50m49.58876261s" Apr 16 17:41:28.524853 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.524322 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:28.524853 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:28.524452 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:28.540623 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.540571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" event={"ID":"045b24ad5fe56c3ffbf4f39e4a48e404","Type":"ContainerStarted","Data":"9abdf72a189a3c7e02a3e3c3e1815ce91ed3ebfe79bc871c9b7a3500de67b0bc"} Apr 16 17:41:28.549176 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.549114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7mp6" event={"ID":"fb67ec76-fe28-444f-b4f0-51430f30c713","Type":"ContainerStarted","Data":"fb382dc8ed68eb4b9ff521ea55536d3608ed24d3a1d8e1d32bdf000f18d54544"} Apr 16 17:41:28.553579 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.553531 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" event={"ID":"9a25f350-b652-47af-8404-87e373883218","Type":"ContainerStarted","Data":"b7f2ef6fd614bf1c64aecbdec0bd22ab12fa66bc2c778b8ae9e1386429e7d000"} Apr 16 17:41:28.556986 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.556381 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-62.ec2.internal" podStartSLOduration=2.556368064 podStartE2EDuration="2.556368064s" podCreationTimestamp="2026-04-16 17:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:28.555693015 +0000 UTC m=+3.635914145" watchObservedRunningTime="2026-04-16 17:41:28.556368064 +0000 UTC m=+3.636589194"
Apr 16 17:41:28.560870 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.560825 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x6c67" event={"ID":"aae5a78d-4165-4422-a612-17627616235f","Type":"ContainerStarted","Data":"7c78d990b54311932cfced7d93ae1a6806aa9a1a4f1f35d8c2dd5e279efa32ff"}
Apr 16 17:41:28.566431 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.566407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tl2gd" event={"ID":"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319","Type":"ContainerStarted","Data":"d62948d20b4d9288cf1642a17bf051e9e9c709aecf234602d5cfd6fc5c06c974"}
Apr 16 17:41:28.573755 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.573727 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" event={"ID":"4e7d951d-c7c5-444a-93c5-26faa5766be6","Type":"ContainerStarted","Data":"2f455a08ddafdcab86fd44ecedb642e9ae6d0dad0ce01efe22225308c241d25d"}
Apr 16 17:41:28.577964 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.577926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7lffp" event={"ID":"0321b210-b7f5-4cfe-9d6d-318f4ba3d299","Type":"ContainerStarted","Data":"03e417ed1e2bd8bf8ebc80dd35d42103fc13e0f555222e2cb543949620d05558"}
Apr 16 17:41:28.580144 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.580122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j272p" event={"ID":"25d3965d-906a-4df5-bec1-9edc3c2a8a64","Type":"ContainerStarted","Data":"dae4c06640a0723bdd0ecf48a3d96ef8bdb18935be6a431d1e05330039d77b7a"}
Apr 16 17:41:28.582105 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.582085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"7d943b73a4ee50e5c59b70d0c9d8e140168d1013656e586ae66bf6e045665d6b"}
Apr 16 17:41:28.587295 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:28.587258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerStarted","Data":"4c6e6112960d505acf73781ee70b58f8adb4bd7e24b85e82f9959f2f99ec2657"}
Apr 16 17:41:29.080433 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:29.080406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:29.080636 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.080617 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:29.080701 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.080681 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:31.08066177 +0000 UTC m=+6.160882892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:29.181747 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:29.181717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:29.181931 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.181915 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:29.181996 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.181939 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:29.181996 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.181953 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:29.182111 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.182005 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:31.18198687 +0000 UTC m=+6.262207979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:29.526999 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:29.526973 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:29.527481 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:29.527104 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:29.598928 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:29.598896 2570 generic.go:358] "Generic (PLEG): container finished" podID="f2638ab93f1463c33d45992d535fdc15" containerID="c0f140636276ab50a760cd1054b295b8c270b0ce2c73545b65ad5b2ca99ad9c2" exitCode=0
Apr 16 17:41:29.599764 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:29.599739 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" event={"ID":"f2638ab93f1463c33d45992d535fdc15","Type":"ContainerDied","Data":"c0f140636276ab50a760cd1054b295b8c270b0ce2c73545b65ad5b2ca99ad9c2"}
Apr 16 17:41:30.524664 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:30.524476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:30.524664 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:30.524623 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:30.609551 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:30.608866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" event={"ID":"f2638ab93f1463c33d45992d535fdc15","Type":"ContainerStarted","Data":"dec614714d829640f5fd459b20ef5057db0b64ac83863f7bafe714c38258351c"}
Apr 16 17:41:31.098558 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:31.098498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:31.098735 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.098683 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:31.098797 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.098746 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:35.098727799 +0000 UTC m=+10.178948916 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:31.199678 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:31.199649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:31.199899 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.199878 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:31.199975 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.199909 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:31.199975 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.199923 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:31.200086 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.199981 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:35.199962283 +0000 UTC m=+10.280183404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:31.524768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:31.524692 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:31.524917 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:31.524838 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:32.524066 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:32.524034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:32.524454 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:32.524162 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:33.526028 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:33.524813 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:33.526028 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:33.524945 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:34.523890 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:34.523852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:34.524048 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:34.523992 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:35.129162 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:35.129087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:35.129617 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.129250 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:35.129617 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.129336 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:43.129314791 +0000 UTC m=+18.209535901 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:35.230196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:35.230091 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:35.230356 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.230258 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:35.230356 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.230282 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:35.230356 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.230327 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:35.230546 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.230394 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:43.230377008 +0000 UTC m=+18.310598122 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:35.526107 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:35.525602 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:35.526107 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:35.525721 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:36.524560 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:36.524526 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:36.524972 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:36.524649 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:37.524647 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:37.524613 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:37.525059 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:37.524750 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:38.524349 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:38.524317 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:38.524598 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:38.524409 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:39.524737 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:39.524667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:39.525180 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:39.524794 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:40.524208 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:40.524169 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:40.524358 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:40.524278 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:41.524149 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:41.524118 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:41.524615 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:41.524236 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:42.524145 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:42.524118 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:42.524306 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:42.524217 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:43.189155 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:43.189124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:43.189314 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.189266 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:43.189368 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.189329 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:59.189308732 +0000 UTC m=+34.269529844 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:43.289897 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:43.289818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:43.290042 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.289968 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:43.290042 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.289988 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:43.290042 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.290000 2570 projected.go:194] Error preparing data for projected volume kube-api-access-77hw2 for pod openshift-network-diagnostics/network-check-target-5rkjf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:43.290196 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.290054 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2 podName:26163ff9-2d96-4401-962b-735123e76554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:59.290036719 +0000 UTC m=+34.370257841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-77hw2" (UniqueName: "kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2") pod "network-check-target-5rkjf" (UID: "26163ff9-2d96-4401-962b-735123e76554") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:43.524663 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:43.524631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:43.525076 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:43.524753 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:44.524720 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:44.524690 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:44.525106 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:44.524802 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:45.524405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.524277 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:41:45.524494 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:45.524464 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:41:45.633146 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.633113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j272p" event={"ID":"25d3965d-906a-4df5-bec1-9edc3c2a8a64","Type":"ContainerStarted","Data":"e4b7e9f15cc281ebbcab1258e41b42297aaa6862b996a4205ad747ef1f0ec712"}
Apr 16 17:41:45.635291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635245 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:41:45.635583 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635564 2570 generic.go:358] "Generic (PLEG): container finished" podID="6f384fae-bee8-46b2-8fd3-71f7ece4b87e" containerID="6a8ad45f8f1de392878be84e5c243cd147e04d016d156816cf7b0d2ae1cfa511" exitCode=1
Apr 16 17:41:45.635674 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"6e48e538c1f1d1da96af939f4eefbaaeb024a4e4feef4d28097adc8f300cbd16"}
Apr 16 17:41:45.635674 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635666 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"e049cb087421e5f9c93f41f2762c26c52511357f0b081ef303f2c650abdfff14"}
Apr 16 17:41:45.635771 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerDied","Data":"6a8ad45f8f1de392878be84e5c243cd147e04d016d156816cf7b0d2ae1cfa511"}
Apr 16 17:41:45.635771 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.635694 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"66ee67529b824fdd30140b5dfa70165e5c9479e6b266b58a164b599e558ea8df"}
Apr 16 17:41:45.637040 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.637006 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerStarted","Data":"9b6e101b9679cf763f87a7c950d1c817fb6168b04de9d48747c74f3f322a8858"}
Apr 16 17:41:45.638480 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.638450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7mp6" event={"ID":"fb67ec76-fe28-444f-b4f0-51430f30c713","Type":"ContainerStarted","Data":"421539464c772f9dbb618830e86582f6c793728b6a531787699df4c079d3e0be"}
Apr 16 17:41:45.640081 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.640054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" event={"ID":"9a25f350-b652-47af-8404-87e373883218","Type":"ContainerStarted","Data":"3b4ef5dd0cbd5b7f8ef44ae1fae6394c2e421b69b1474fea66c7ecf0bed41cdc"}
Apr 16 17:41:45.641548 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.641522 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x6c67" event={"ID":"aae5a78d-4165-4422-a612-17627616235f","Type":"ContainerStarted","Data":"04e8060167ce15006243599751ddcc74d88f244d8fe7afd318e714766c4e085f"}
Apr 16 17:41:45.643012 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.642988 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tl2gd" event={"ID":"8e15b302-e2d8-4a43-85d6-2c1a3bb9b319","Type":"ContainerStarted","Data":"370fa10d0ea371062e2f8a623b6f6b06d370168b11207301b023afd44e516ffb"}
Apr 16 17:41:45.644933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.644912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" event={"ID":"4e7d951d-c7c5-444a-93c5-26faa5766be6","Type":"ContainerStarted","Data":"9035aa79f281c8b5649ea67b5b96a491409fdcf56ddad2cee6de85fe13d471fe"}
Apr 16 17:41:45.648667 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.648619 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j272p" podStartSLOduration=3.676195622 podStartE2EDuration="20.648605433s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.983128706 +0000 UTC m=+3.063349813" lastFinishedPulling="2026-04-16 17:41:44.955538509 +0000 UTC m=+20.035759624" observedRunningTime="2026-04-16 17:41:45.648292859 +0000 UTC m=+20.728513988" watchObservedRunningTime="2026-04-16 17:41:45.648605433 +0000 UTC m=+20.728826563"
Apr 16 17:41:45.648816 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.648778 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-62.ec2.internal" podStartSLOduration=19.648769331 podStartE2EDuration="19.648769331s" podCreationTimestamp="2026-04-16 17:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:30.626946026 +0000 UTC m=+5.707167156" watchObservedRunningTime="2026-04-16 17:41:45.648769331 +0000 UTC m=+20.728990463"
Apr 16 17:41:45.667949 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.667913 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n7mp6" podStartSLOduration=3.661846227 podStartE2EDuration="20.667899728s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.99501058 +0000 UTC m=+3.075231690" lastFinishedPulling="2026-04-16 17:41:45.00106406 +0000 UTC m=+20.081285191" observedRunningTime="2026-04-16 17:41:45.667211095 +0000 UTC m=+20.747432214" watchObservedRunningTime="2026-04-16 17:41:45.667899728 +0000 UTC m=+20.748120858"
Apr 16 17:41:45.682618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.682574 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tl2gd" podStartSLOduration=3.7149852709999998 podStartE2EDuration="20.682557878s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.988327456 +0000 UTC m=+3.068548567" lastFinishedPulling="2026-04-16 17:41:44.95590006 +0000 UTC m=+20.036121174" observedRunningTime="2026-04-16 17:41:45.681835005 +0000 UTC m=+20.762056136" watchObservedRunningTime="2026-04-16 17:41:45.682557878 +0000 UTC m=+20.762779008"
Apr 16 17:41:45.698860 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.698825 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gjbj8" podStartSLOduration=3.732898567 podStartE2EDuration="20.69881296s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.991434925 +0000 UTC m=+3.071656042" lastFinishedPulling="2026-04-16 17:41:44.957349323 +0000 UTC m=+20.037570435" observedRunningTime="2026-04-16 17:41:45.698810017 +0000 UTC m=+20.779031144" watchObservedRunningTime="2026-04-16 17:41:45.69881296 +0000 UTC m=+20.779034087"
Apr 16 17:41:45.739897 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:45.739770 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x6c67" podStartSLOduration=8.385650154 podStartE2EDuration="20.739756321s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.990936912 +0000 UTC m=+3.071158032" lastFinishedPulling="2026-04-16 17:41:40.345043087 +0000 UTC m=+15.425264199" observedRunningTime="2026-04-16 17:41:45.71668539 +0000 UTC m=+20.796906521" watchObservedRunningTime="2026-04-16 17:41:45.739756321 +0000 UTC m=+20.819977505"
Apr 16 17:41:46.524165 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.524140 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:41:46.524271 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:46.524244 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554"
Apr 16 17:41:46.647635 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.647607 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7lffp" event={"ID":"0321b210-b7f5-4cfe-9d6d-318f4ba3d299","Type":"ContainerStarted","Data":"8baa101fbedf2bc82038261c13f7ebeee202c252afc104747ba96d02f34799c6"}
Apr 16 17:41:46.649739 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.649723 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:41:46.650027 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.650008 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"bc72b43aa02bbf921f7bdfee41278c7be3d87bf36592a99ebff6afa76762eb5e"}
Apr 16 17:41:46.650075 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.650035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"020fd164fc8ce0450127ca6012a79ac4778b2146157ffed55b76970c160f71b3"}
Apr 16 17:41:46.651117 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.651099 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="9b6e101b9679cf763f87a7c950d1c817fb6168b04de9d48747c74f3f322a8858" exitCode=0
Apr 16 17:41:46.651231 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.651198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"9b6e101b9679cf763f87a7c950d1c817fb6168b04de9d48747c74f3f322a8858"}
Apr
16 17:41:46.663193 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.663155 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7lffp" podStartSLOduration=4.705792698 podStartE2EDuration="21.663142269s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.984953592 +0000 UTC m=+3.065174715" lastFinishedPulling="2026-04-16 17:41:44.942303176 +0000 UTC m=+20.022524286" observedRunningTime="2026-04-16 17:41:46.662672689 +0000 UTC m=+21.742893820" watchObservedRunningTime="2026-04-16 17:41:46.663142269 +0000 UTC m=+21.743363399" Apr 16 17:41:46.678834 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.678817 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:41:46.884459 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:46.884437 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:47.430877 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:47.430759 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:41:46.67883031Z","UUID":"70f74969-3679-4253-90f1-5373da09b43c","Handler":null,"Name":"","Endpoint":""} Apr 16 17:41:47.433975 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:47.433943 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:41:47.433975 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:47.433971 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:41:47.524782 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:41:47.524748 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:47.524941 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:47.524917 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:47.655021 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:47.654989 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" event={"ID":"4e7d951d-c7c5-444a-93c5-26faa5766be6","Type":"ContainerStarted","Data":"01651405516a977c0fcd6db04c463a3bf6c2fde015181b8db755f0690ad7ce2c"} Apr 16 17:41:48.028196 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.028139 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:48.028873 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.028852 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:48.523930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.523744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:48.524048 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:48.523953 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:48.658154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.658122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" event={"ID":"4e7d951d-c7c5-444a-93c5-26faa5766be6","Type":"ContainerStarted","Data":"f26c1f4aa644f44378513f730f6763156af28d88fad7f01b83cb530963325627"} Apr 16 17:41:48.661108 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.661086 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log" Apr 16 17:41:48.661474 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.661436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"9bf7198dab966cbce4c7de2afed3fa6b4399f9ddd5234959f3fa03c675023ac7"} Apr 16 17:41:48.662229 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.662202 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tl2gd" Apr 16 17:41:48.680468 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:48.680432 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sm5r9" podStartSLOduration=3.65415641 podStartE2EDuration="23.680420107s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.987492804 +0000 UTC m=+3.067713918" lastFinishedPulling="2026-04-16 17:41:48.013756501 +0000 UTC m=+23.093977615" observedRunningTime="2026-04-16 17:41:48.675786508 +0000 UTC m=+23.756007636" watchObservedRunningTime="2026-04-16 17:41:48.680420107 +0000 UTC m=+23.760641236" Apr 16 17:41:49.523779 ip-10-0-140-62 kubenswrapper[2570]: 
I0416 17:41:49.523743 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:49.523989 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:49.523870 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:50.523880 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:50.523849 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:50.524401 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:50.523962 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:51.524017 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.523761 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:51.524458 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:51.524088 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:51.669248 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.669228 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log" Apr 16 17:41:51.669617 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.669592 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"6de256f2d82e9ccd00bc99e1b2ba329e3fdfa06c1dbe98a78dba1f664abc4837"} Apr 16 17:41:51.669895 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.669878 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:51.669979 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.669906 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:51.670088 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.670073 2570 scope.go:117] "RemoveContainer" containerID="6a8ad45f8f1de392878be84e5c243cd147e04d016d156816cf7b0d2ae1cfa511" Apr 16 17:41:51.671398 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.671372 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="10c1031045755b204078bdbb522db49fa95d5f7af5d65535d9d67bd451546cc1" exitCode=0 Apr 16 17:41:51.671483 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.671423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"10c1031045755b204078bdbb522db49fa95d5f7af5d65535d9d67bd451546cc1"} Apr 16 17:41:51.684826 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:51.684806 
2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:52.524771 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.524715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:52.525064 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:52.524810 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:52.678297 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.678274 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log" Apr 16 17:41:52.678650 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.678630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" event={"ID":"6f384fae-bee8-46b2-8fd3-71f7ece4b87e","Type":"ContainerStarted","Data":"ca6aaf7d547f613b27795fb7b3deda414fb113d9dfea46ce252fe7833de3dbf4"} Apr 16 17:41:52.678996 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.678961 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:52.680878 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.680853 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="949c1ec6ad6eedb577b86ddd179fb8dfad77a86d33e44fa8051e48a84a3d7a31" exitCode=0 Apr 16 17:41:52.681012 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.680890 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"949c1ec6ad6eedb577b86ddd179fb8dfad77a86d33e44fa8051e48a84a3d7a31"} Apr 16 17:41:52.693197 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.693180 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" Apr 16 17:41:52.709710 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:52.709661 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm" podStartSLOduration=10.671471238 podStartE2EDuration="27.709645014s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.9946896 +0000 UTC m=+3.074910711" lastFinishedPulling="2026-04-16 17:41:45.032863369 +0000 UTC m=+20.113084487" observedRunningTime="2026-04-16 17:41:52.707951382 +0000 UTC m=+27.788172538" watchObservedRunningTime="2026-04-16 17:41:52.709645014 +0000 UTC m=+27.789866146" Apr 16 17:41:53.053873 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.053278 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l7h7z"] Apr 16 17:41:53.053873 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.053415 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:53.053873 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:53.053539 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:53.056302 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.056278 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5rkjf"] Apr 16 17:41:53.056434 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.056364 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:53.056501 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:53.056456 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:53.684392 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.684365 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="2e7494e0cbb0e84cce8ba4569ddd38c461cb45f48f8bf807469a53b29883b007" exitCode=0 Apr 16 17:41:53.684749 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:53.684449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"2e7494e0cbb0e84cce8ba4569ddd38c461cb45f48f8bf807469a53b29883b007"} Apr 16 17:41:54.524605 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:54.524282 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:54.524605 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:54.524324 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:54.524809 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:54.524666 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:54.524809 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:54.524744 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:56.524449 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:56.524419 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:56.524971 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:56.524424 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:56.524971 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:56.524537 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5rkjf" podUID="26163ff9-2d96-4401-962b-735123e76554" Apr 16 17:41:56.524971 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:56.524653 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835" Apr 16 17:41:58.275835 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.275754 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-62.ec2.internal" event="NodeReady" Apr 16 17:41:58.276340 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.275865 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 17:41:58.346807 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.346777 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-95rfp"] Apr 16 17:41:58.349402 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.349376 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.349727 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.349704 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4ctsq"] Apr 16 17:41:58.351275 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.351258 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:58.352294 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.352273 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2vlxb\"" Apr 16 17:41:58.352569 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.352546 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 17:41:58.352663 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.352590 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 17:41:58.353714 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.353696 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 17:41:58.353814 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.353799 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 17:41:58.353879 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.353827 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 17:41:58.353999 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.353864 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w8phs\"" Apr 16 17:41:58.359592 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.359556 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-95rfp"] Apr 16 17:41:58.361374 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.361354 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4ctsq"] Apr 16 17:41:58.508118 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508082 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgwk7\" (UniqueName: \"kubernetes.io/projected/64a21724-619d-4a2f-b61f-bf293308211b-kube-api-access-jgwk7\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:58.508268 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-config-volume\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.508268 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508203 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.508268 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508253 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:58.508385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97tlh\" (UniqueName: \"kubernetes.io/projected/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-kube-api-access-97tlh\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 
17:41:58.508385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.508326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-tmp-dir\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.524200 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.524174 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:58.524318 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.524181 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:58.527162 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.527042 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:41:58.527265 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.527178 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:41:58.527265 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.527211 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:41:58.527354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.527272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8xx6l\"" Apr 16 17:41:58.527429 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.527410 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qnscp\"" Apr 16 17:41:58.608831 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608805 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.608960 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:58.608960 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97tlh\" (UniqueName: \"kubernetes.io/projected/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-kube-api-access-97tlh\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.608960 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-tmp-dir\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.608960 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgwk7\" (UniqueName: \"kubernetes.io/projected/64a21724-619d-4a2f-b61f-bf293308211b-kube-api-access-jgwk7\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:58.609161 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:58.608968 2570 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:41:58.609161 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:58.609031 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:41:58.609161 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:58.609041 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:59.109019712 +0000 UTC m=+34.189240823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found Apr 16 17:41:58.609161 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:58.609076 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:41:59.109062542 +0000 UTC m=+34.189283656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found Apr 16 17:41:58.609161 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.608968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-config-volume\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.609459 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.609274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-tmp-dir\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.609519 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.609472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-config-volume\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.620486 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.620461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97tlh\" (UniqueName: \"kubernetes.io/projected/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-kube-api-access-97tlh\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:58.620602 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:58.620592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgwk7\" (UniqueName: 
\"kubernetes.io/projected/64a21724-619d-4a2f-b61f-bf293308211b-kube-api-access-jgwk7\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:59.112500 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.112473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:41:59.112500 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.112522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:41:59.112697 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.112600 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:41:59.112697 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.112604 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:41:59.112697 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.112648 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:42:00.112634402 +0000 UTC m=+35.192855510 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found Apr 16 17:41:59.112697 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.112660 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:00.112654564 +0000 UTC m=+35.192875671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found Apr 16 17:41:59.213480 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.213455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:41:59.213670 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.213595 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 17:41:59.213670 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:41:59.213651 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:31.213635075 +0000 UTC m=+66.293856191 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : secret "metrics-daemon-secret" not found Apr 16 17:41:59.314074 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.314047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:59.316282 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.316259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77hw2\" (UniqueName: \"kubernetes.io/projected/26163ff9-2d96-4401-962b-735123e76554-kube-api-access-77hw2\") pod \"network-check-target-5rkjf\" (UID: \"26163ff9-2d96-4401-962b-735123e76554\") " pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:59.436685 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.436637 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:41:59.607638 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.607439 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5rkjf"] Apr 16 17:41:59.614877 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:41:59.614846 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26163ff9_2d96_4401_962b_735123e76554.slice/crio-83ddf1cc6b5229b49e84f392108fd87947488d724aebf601bff72f732c86927c WatchSource:0}: Error finding container 83ddf1cc6b5229b49e84f392108fd87947488d724aebf601bff72f732c86927c: Status 404 returned error can't find the container with id 83ddf1cc6b5229b49e84f392108fd87947488d724aebf601bff72f732c86927c Apr 16 17:41:59.698530 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.698405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerStarted","Data":"59a660bc89ce7a8fb0e79554522b277902acf994a7f7c5924a846688ce892ebb"} Apr 16 17:41:59.699591 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:41:59.699565 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5rkjf" event={"ID":"26163ff9-2d96-4401-962b-735123e76554","Type":"ContainerStarted","Data":"83ddf1cc6b5229b49e84f392108fd87947488d724aebf601bff72f732c86927c"} Apr 16 17:42:00.119735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:00.119705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:42:00.119844 ip-10-0-140-62 kubenswrapper[2570]: I0416 
17:42:00.119755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:42:00.119844 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:00.119835 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:42:00.119943 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:00.119872 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:42:00.119943 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:00.119894 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:02.119877512 +0000 UTC m=+37.200098625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found Apr 16 17:42:00.119943 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:00.119918 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:42:02.119902001 +0000 UTC m=+37.200123113 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found Apr 16 17:42:00.704291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:00.704260 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="59a660bc89ce7a8fb0e79554522b277902acf994a7f7c5924a846688ce892ebb" exitCode=0 Apr 16 17:42:00.704722 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:00.704299 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"59a660bc89ce7a8fb0e79554522b277902acf994a7f7c5924a846688ce892ebb"} Apr 16 17:42:01.709919 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:01.709888 2570 generic.go:358] "Generic (PLEG): container finished" podID="29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8" containerID="54defa4124c0f4f3ff032f525cc6501f3bb843bd34193966fbd6c1f6267cf84b" exitCode=0 Apr 16 17:42:01.710379 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:01.709952 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerDied","Data":"54defa4124c0f4f3ff032f525cc6501f3bb843bd34193966fbd6c1f6267cf84b"} Apr 16 17:42:02.135630 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:02.135589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:42:02.135827 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:02.135649 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:42:02.135827 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:02.135763 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:42:02.135827 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:02.135789 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:42:02.135985 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:02.135847 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:06.135823893 +0000 UTC m=+41.216045046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found Apr 16 17:42:02.135985 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:02.135867 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:42:06.13585869 +0000 UTC m=+41.216079800 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found Apr 16 17:42:02.715150 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:02.714948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" event={"ID":"29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8","Type":"ContainerStarted","Data":"0a4ebb77b22b921e7c831d09829f3a78c053774675fa390b3cff05505b5ce0f9"} Apr 16 17:42:02.739776 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:02.739734 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jdzwz" podStartSLOduration=6.286779709 podStartE2EDuration="37.739720689s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:27.991854979 +0000 UTC m=+3.072076098" lastFinishedPulling="2026-04-16 17:41:59.444795962 +0000 UTC m=+34.525017078" observedRunningTime="2026-04-16 17:42:02.738038161 +0000 UTC m=+37.818259289" watchObservedRunningTime="2026-04-16 17:42:02.739720689 +0000 UTC m=+37.819941817" Apr 16 17:42:03.720102 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:03.720062 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5rkjf" event={"ID":"26163ff9-2d96-4401-962b-735123e76554","Type":"ContainerStarted","Data":"0bd0f769a515ea8eb9ed6c3849dc40f2630107439bedf609154ca5f80a63e190"} Apr 16 17:42:03.720580 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:03.720343 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5rkjf" Apr 16 17:42:03.735821 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:03.735783 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-5rkjf" podStartSLOduration=35.727842329 podStartE2EDuration="38.735770438s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:59.616702783 +0000 UTC m=+34.696923893" lastFinishedPulling="2026-04-16 17:42:02.624630878 +0000 UTC m=+37.704852002" observedRunningTime="2026-04-16 17:42:03.735594241 +0000 UTC m=+38.815815370" watchObservedRunningTime="2026-04-16 17:42:03.735770438 +0000 UTC m=+38.815991566" Apr 16 17:42:06.163009 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:06.162970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp" Apr 16 17:42:06.163009 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:06.163012 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq" Apr 16 17:42:06.163445 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:06.163100 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:42:06.163445 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:06.163111 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:42:06.163445 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:06.163147 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. 
No retries permitted until 2026-04-16 17:42:14.163133627 +0000 UTC m=+49.243354734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found Apr 16 17:42:06.163445 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:06.163170 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:14.163152607 +0000 UTC m=+49.243373725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found Apr 16 17:42:07.689924 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.689896 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd"] Apr 16 17:42:07.692513 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.692489 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.696550 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.696500 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 17:42:07.696550 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.696527 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 17:42:07.696550 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.696527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 17:42:07.697594 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.697568 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 17:42:07.697862 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.697844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-wk4lg\"" Apr 16 17:42:07.700097 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.700075 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"] Apr 16 17:42:07.704812 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.704789 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd"] Apr 16 17:42:07.704910 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.704898 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.705310 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.705291 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"] Apr 16 17:42:07.707445 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.707427 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 17:42:07.874053 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.874027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-tmp\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.874139 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.874065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/871088ca-9a0a-4901-b6b4-8f72389ab255-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.874139 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.874095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrf5h\" (UniqueName: \"kubernetes.io/projected/871088ca-9a0a-4901-b6b4-8f72389ab255-kube-api-access-jrf5h\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.874139 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.874122 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.874285 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.874148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsps\" (UniqueName: \"kubernetes.io/projected/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-kube-api-access-jtsps\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.974522 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-tmp\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.974522 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/871088ca-9a0a-4901-b6b4-8f72389ab255-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 
17:42:07.974622 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrf5h\" (UniqueName: \"kubernetes.io/projected/871088ca-9a0a-4901-b6b4-8f72389ab255-kube-api-access-jrf5h\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.974622 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.974622 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtsps\" (UniqueName: \"kubernetes.io/projected/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-kube-api-access-jtsps\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.974860 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.974843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-tmp\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.978623 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.978597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/871088ca-9a0a-4901-b6b4-8f72389ab255-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.978703 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.978597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:07.983622 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.983597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrf5h\" (UniqueName: \"kubernetes.io/projected/871088ca-9a0a-4901-b6b4-8f72389ab255-kube-api-access-jrf5h\") pod \"managed-serviceaccount-addon-agent-5954cd7dff-kxbzd\" (UID: \"871088ca-9a0a-4901-b6b4-8f72389ab255\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" Apr 16 17:42:07.983740 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:07.983720 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtsps\" (UniqueName: \"kubernetes.io/projected/9706163e-8fe9-4ea7-9c1e-2114a03c13f2-kube-api-access-jtsps\") pod \"klusterlet-addon-workmgr-659648565c-cvtzt\" (UID: \"9706163e-8fe9-4ea7-9c1e-2114a03c13f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:42:08.016518 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.016491 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd"
Apr 16 17:42:08.024070 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.024050 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"
Apr 16 17:42:08.143014 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.142985 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd"]
Apr 16 17:42:08.146244 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:42:08.146213 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871088ca_9a0a_4901_b6b4_8f72389ab255.slice/crio-95694bc4760d0ed9bcccfab34024acd48f3d625f4176d09b1f27f54accbc2ff5 WatchSource:0}: Error finding container 95694bc4760d0ed9bcccfab34024acd48f3d625f4176d09b1f27f54accbc2ff5: Status 404 returned error can't find the container with id 95694bc4760d0ed9bcccfab34024acd48f3d625f4176d09b1f27f54accbc2ff5
Apr 16 17:42:08.175532 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.175493 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"]
Apr 16 17:42:08.178021 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:42:08.177999 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9706163e_8fe9_4ea7_9c1e_2114a03c13f2.slice/crio-262362cfcff4a463b0fa046d14380331a8d575af09d899e255e1b61b7de763ca WatchSource:0}: Error finding container 262362cfcff4a463b0fa046d14380331a8d575af09d899e255e1b61b7de763ca: Status 404 returned error can't find the container with id 262362cfcff4a463b0fa046d14380331a8d575af09d899e255e1b61b7de763ca
Apr 16 17:42:08.729362 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.729332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" event={"ID":"9706163e-8fe9-4ea7-9c1e-2114a03c13f2","Type":"ContainerStarted","Data":"262362cfcff4a463b0fa046d14380331a8d575af09d899e255e1b61b7de763ca"}
Apr 16 17:42:08.730202 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:08.730186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" event={"ID":"871088ca-9a0a-4901-b6b4-8f72389ab255","Type":"ContainerStarted","Data":"95694bc4760d0ed9bcccfab34024acd48f3d625f4176d09b1f27f54accbc2ff5"}
Apr 16 17:42:13.740673 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.740640 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" event={"ID":"9706163e-8fe9-4ea7-9c1e-2114a03c13f2","Type":"ContainerStarted","Data":"2e6e3cfe87f88f1d9833e6f0fce4537321bd96dbfd936cef40fd4235344511ab"}
Apr 16 17:42:13.741144 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.740840 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"
Apr 16 17:42:13.741994 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.741962 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" event={"ID":"871088ca-9a0a-4901-b6b4-8f72389ab255","Type":"ContainerStarted","Data":"1021f475d4c0ea48eb4f335659697f69743a8ae96cadbdd094180747791a1405"}
Apr 16 17:42:13.742408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.742392 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt"
Apr 16 17:42:13.758058 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.758020 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" podStartSLOduration=2.005669399 podStartE2EDuration="6.758008945s" podCreationTimestamp="2026-04-16 17:42:07 +0000 UTC" firstStartedPulling="2026-04-16 17:42:08.179466759 +0000 UTC m=+43.259687867" lastFinishedPulling="2026-04-16 17:42:12.931806306 +0000 UTC m=+48.012027413" observedRunningTime="2026-04-16 17:42:13.757307645 +0000 UTC m=+48.837528773" watchObservedRunningTime="2026-04-16 17:42:13.758008945 +0000 UTC m=+48.838230068"
Apr 16 17:42:13.772428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:13.772386 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" podStartSLOduration=2.002867937 podStartE2EDuration="6.772374599s" podCreationTimestamp="2026-04-16 17:42:07 +0000 UTC" firstStartedPulling="2026-04-16 17:42:08.148138419 +0000 UTC m=+43.228359526" lastFinishedPulling="2026-04-16 17:42:12.917645081 +0000 UTC m=+47.997866188" observedRunningTime="2026-04-16 17:42:13.771672286 +0000 UTC m=+48.851893415" watchObservedRunningTime="2026-04-16 17:42:13.772374599 +0000 UTC m=+48.852595728"
Apr 16 17:42:14.224628 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:14.224606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp"
Apr 16 17:42:14.224714 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:14.224637 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:42:14.224753 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:14.224720 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:42:14.224788 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:14.224765 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:30.224751782 +0000 UTC m=+65.304972894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found
Apr 16 17:42:14.224829 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:14.224721 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:42:14.224872 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:14.224863 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:42:30.224848468 +0000 UTC m=+65.305069576 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found
Apr 16 17:42:24.699609 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:24.699579 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9mxm"
Apr 16 17:42:30.323841 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:30.323806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp"
Apr 16 17:42:30.323841 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:30.323844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:42:30.324349 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:30.323931 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:42:30.324349 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:30.323949 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:42:30.324349 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:30.323986 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:43:02.323973799 +0000 UTC m=+97.404194906 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found
Apr 16 17:42:30.324349 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:30.324004 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:02.32399187 +0000 UTC m=+97.404212977 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found
Apr 16 17:42:31.229290 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:31.229263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:42:31.229428 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:31.229378 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 17:42:31.229468 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:42:31.229434 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:35.229422465 +0000 UTC m=+130.309643572 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : secret "metrics-daemon-secret" not found
Apr 16 17:42:34.724448 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:42:34.724420 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5rkjf"
Apr 16 17:43:02.336408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:02.336376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp"
Apr 16 17:43:02.336408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:02.336412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:43:02.336903 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:02.336518 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:43:02.336903 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:02.336577 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls podName:cca91f7c-ca0a-4c7a-97bc-93c7e35d6271 nodeName:}" failed. No retries permitted until 2026-04-16 17:44:06.336563568 +0000 UTC m=+161.416784675 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls") pod "dns-default-95rfp" (UID: "cca91f7c-ca0a-4c7a-97bc-93c7e35d6271") : secret "dns-default-metrics-tls" not found
Apr 16 17:43:02.336903 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:02.336519 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:43:02.336903 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:02.336648 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert podName:64a21724-619d-4a2f-b61f-bf293308211b nodeName:}" failed. No retries permitted until 2026-04-16 17:44:06.336636322 +0000 UTC m=+161.416857429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert") pod "ingress-canary-4ctsq" (UID: "64a21724-619d-4a2f-b61f-bf293308211b") : secret "canary-serving-cert" not found
Apr 16 17:43:32.958667 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:32.958640 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j272p_25d3965d-906a-4df5-bec1-9edc3c2a8a64/dns-node-resolver/0.log"
Apr 16 17:43:34.158554 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:34.158528 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x6c67_aae5a78d-4165-4422-a612-17627616235f/node-ca/0.log"
Apr 16 17:43:35.245604 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:35.245573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z"
Apr 16 17:43:35.245944 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:35.245715 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 17:43:35.245944 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:35.245770 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs podName:38d21ef6-c2df-4bbd-8185-bf4fff5cb835 nodeName:}" failed. No retries permitted until 2026-04-16 17:45:37.245755277 +0000 UTC m=+252.325976383 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs") pod "network-metrics-daemon-l7h7z" (UID: "38d21ef6-c2df-4bbd-8185-bf4fff5cb835") : secret "metrics-daemon-secret" not found
Apr 16 17:43:37.609647 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.609620 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"]
Apr 16 17:43:37.612265 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.612250 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"
Apr 16 17:43:37.614724 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.614703 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 17:43:37.615840 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.615242 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:43:37.618662 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.618521 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xl7cn\""
Apr 16 17:43:37.619587 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.619567 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"]
Apr 16 17:43:37.725343 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.725323 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4977r"]
Apr 16 17:43:37.728027 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.728015 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.730286 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.730270 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 17:43:37.730462 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.730449 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 17:43:37.730579 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.730562 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 17:43:37.730910 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.730896 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:43:37.731014 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.730998 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-46r2z\""
Apr 16 17:43:37.736135 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.736113 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 17:43:37.740345 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.740326 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4977r"]
Apr 16 17:43:37.763292 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.763271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzf9h\" (UniqueName: \"kubernetes.io/projected/bae703ff-9612-493f-8d86-c275eac39802-kube-api-access-tzf9h\") pod \"volume-data-source-validator-7d955d5dd4-zqvq9\" (UID: \"bae703ff-9612-493f-8d86-c275eac39802\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"
Apr 16 17:43:37.863918 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.863864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c31261b-9dd1-44b8-b6ad-4092f61c1883-serving-cert\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.863918 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.863895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvjs\" (UniqueName: \"kubernetes.io/projected/7c31261b-9dd1-44b8-b6ad-4092f61c1883-kube-api-access-lpvjs\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.864057 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.863958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-config\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.864057 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.863980 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-trusted-ca\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.864057 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.864005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzf9h\" (UniqueName: \"kubernetes.io/projected/bae703ff-9612-493f-8d86-c275eac39802-kube-api-access-tzf9h\") pod \"volume-data-source-validator-7d955d5dd4-zqvq9\" (UID: \"bae703ff-9612-493f-8d86-c275eac39802\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"
Apr 16 17:43:37.872554 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.872534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzf9h\" (UniqueName: \"kubernetes.io/projected/bae703ff-9612-493f-8d86-c275eac39802-kube-api-access-tzf9h\") pod \"volume-data-source-validator-7d955d5dd4-zqvq9\" (UID: \"bae703ff-9612-493f-8d86-c275eac39802\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"
Apr 16 17:43:37.924543 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.924522 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"
Apr 16 17:43:37.964289 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.964265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c31261b-9dd1-44b8-b6ad-4092f61c1883-serving-cert\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.964408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.964296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvjs\" (UniqueName: \"kubernetes.io/projected/7c31261b-9dd1-44b8-b6ad-4092f61c1883-kube-api-access-lpvjs\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.964408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.964345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-config\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.964408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.964365 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-trusted-ca\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.965166 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.965115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-config\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.965278 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.965260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c31261b-9dd1-44b8-b6ad-4092f61c1883-trusted-ca\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.966635 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.966619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c31261b-9dd1-44b8-b6ad-4092f61c1883-serving-cert\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:37.974636 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:37.974606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvjs\" (UniqueName: \"kubernetes.io/projected/7c31261b-9dd1-44b8-b6ad-4092f61c1883-kube-api-access-lpvjs\") pod \"console-operator-d87b8d5fc-4977r\" (UID: \"7c31261b-9dd1-44b8-b6ad-4092f61c1883\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:38.037165 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:38.037068 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:43:38.038319 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:38.038302 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9"]
Apr 16 17:43:38.044387 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:43:38.044363 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae703ff_9612_493f_8d86_c275eac39802.slice/crio-2d60381376b07f8db566c8dc8c92c6b1002b922262ad39a6a445db7329f19404 WatchSource:0}: Error finding container 2d60381376b07f8db566c8dc8c92c6b1002b922262ad39a6a445db7329f19404: Status 404 returned error can't find the container with id 2d60381376b07f8db566c8dc8c92c6b1002b922262ad39a6a445db7329f19404
Apr 16 17:43:38.150981 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:38.150920 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4977r"]
Apr 16 17:43:38.153798 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:43:38.153769 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c31261b_9dd1_44b8_b6ad_4092f61c1883.slice/crio-2e005faa6f3f55e7b18b9c5d82dcbb37e9f85a7c668afa5cb85ac5eb60712922 WatchSource:0}: Error finding container 2e005faa6f3f55e7b18b9c5d82dcbb37e9f85a7c668afa5cb85ac5eb60712922: Status 404 returned error can't find the container with id 2e005faa6f3f55e7b18b9c5d82dcbb37e9f85a7c668afa5cb85ac5eb60712922
Apr 16 17:43:38.885769 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:38.885720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9" event={"ID":"bae703ff-9612-493f-8d86-c275eac39802","Type":"ContainerStarted","Data":"2d60381376b07f8db566c8dc8c92c6b1002b922262ad39a6a445db7329f19404"}
Apr 16 17:43:38.886880 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:38.886848 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" event={"ID":"7c31261b-9dd1-44b8-b6ad-4092f61c1883","Type":"ContainerStarted","Data":"2e005faa6f3f55e7b18b9c5d82dcbb37e9f85a7c668afa5cb85ac5eb60712922"}
Apr 16 17:43:40.892400 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.892375 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/0.log"
Apr 16 17:43:40.892770 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.892411 2570 generic.go:358] "Generic (PLEG): container finished" podID="7c31261b-9dd1-44b8-b6ad-4092f61c1883" containerID="764c885736c7e3344060aef32c06bbd97967ddb0356b96c3bb7ccd864700ac4c" exitCode=255
Apr 16 17:43:40.892770 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.892524 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" event={"ID":"7c31261b-9dd1-44b8-b6ad-4092f61c1883","Type":"ContainerDied","Data":"764c885736c7e3344060aef32c06bbd97967ddb0356b96c3bb7ccd864700ac4c"}
Apr 16 17:43:40.892770 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.892678 2570 scope.go:117] "RemoveContainer" containerID="764c885736c7e3344060aef32c06bbd97967ddb0356b96c3bb7ccd864700ac4c"
Apr 16 17:43:40.893712 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.893692 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9" event={"ID":"bae703ff-9612-493f-8d86-c275eac39802","Type":"ContainerStarted","Data":"db14450b919bb77a67aaa06e3c751de71ba80cd9764202144ba51fe90b60a4d6"}
Apr 16 17:43:40.931365 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:40.931330 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zqvq9" podStartSLOduration=1.9657663159999998 podStartE2EDuration="3.931318812s" podCreationTimestamp="2026-04-16 17:43:37 +0000 UTC" firstStartedPulling="2026-04-16 17:43:38.046352472 +0000 UTC m=+133.126573578" lastFinishedPulling="2026-04-16 17:43:40.011904953 +0000 UTC m=+135.092126074" observedRunningTime="2026-04-16 17:43:40.930375044 +0000 UTC m=+136.010596173" watchObservedRunningTime="2026-04-16 17:43:40.931318812 +0000 UTC m=+136.011539941"
Apr 16 17:43:41.896950 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.896928 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/1.log"
Apr 16 17:43:41.897330 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.897316 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/0.log"
Apr 16 17:43:41.897384 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.897351 2570 generic.go:358] "Generic (PLEG): container finished" podID="7c31261b-9dd1-44b8-b6ad-4092f61c1883" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d" exitCode=255
Apr 16 17:43:41.897547 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.897446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" event={"ID":"7c31261b-9dd1-44b8-b6ad-4092f61c1883","Type":"ContainerDied","Data":"ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d"}
Apr 16 17:43:41.897547 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.897539 2570 scope.go:117] "RemoveContainer" containerID="764c885736c7e3344060aef32c06bbd97967ddb0356b96c3bb7ccd864700ac4c"
Apr 16 17:43:41.897753 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.897667 2570 scope.go:117] "RemoveContainer" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d"
Apr 16 17:43:41.897915 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:41.897874 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4977r_openshift-console-operator(7c31261b-9dd1-44b8-b6ad-4092f61c1883)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podUID="7c31261b-9dd1-44b8-b6ad-4092f61c1883"
Apr 16 17:43:41.985313 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.985292 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"]
Apr 16 17:43:41.988221 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.988209 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"
Apr 16 17:43:41.990781 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.990763 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 17:43:41.990888 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.990806 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 17:43:41.990949 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.990927 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-bs9tc\""
Apr 16 17:43:41.997907 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:41.997887 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"]
Apr 16 17:43:42.093523 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.093482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkdk6\" (UniqueName: \"kubernetes.io/projected/203a89e4-4618-4411-aa1c-395cb2b30306-kube-api-access-qkdk6\") pod \"migrator-64d4d94569-9b68m\" (UID: \"203a89e4-4618-4411-aa1c-395cb2b30306\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"
Apr 16 17:43:42.194044 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.193993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkdk6\" (UniqueName: \"kubernetes.io/projected/203a89e4-4618-4411-aa1c-395cb2b30306-kube-api-access-qkdk6\") pod \"migrator-64d4d94569-9b68m\" (UID: \"203a89e4-4618-4411-aa1c-395cb2b30306\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"
Apr 16 17:43:42.201431 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.201411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkdk6\" (UniqueName: \"kubernetes.io/projected/203a89e4-4618-4411-aa1c-395cb2b30306-kube-api-access-qkdk6\") pod \"migrator-64d4d94569-9b68m\" (UID: \"203a89e4-4618-4411-aa1c-395cb2b30306\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"
Apr 16 17:43:42.296214 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.296195 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"
Apr 16 17:43:42.409250 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.409222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m"]
Apr 16 17:43:42.412017 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:43:42.411988 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203a89e4_4618_4411_aa1c_395cb2b30306.slice/crio-9d0273e8585590c30a27aa31aba21f03bafafc74ad4d9ddb86ce32ed1d9dda80 WatchSource:0}: Error finding container 9d0273e8585590c30a27aa31aba21f03bafafc74ad4d9ddb86ce32ed1d9dda80: Status 404 returned error can't find the container with id 9d0273e8585590c30a27aa31aba21f03bafafc74ad4d9ddb86ce32ed1d9dda80
Apr 16 17:43:42.900740 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.900711 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/1.log"
Apr 16 17:43:42.901152 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.901094 2570 scope.go:117] "RemoveContainer" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d"
Apr 16 17:43:42.901350 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:42.901309 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4977r_openshift-console-operator(7c31261b-9dd1-44b8-b6ad-4092f61c1883)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podUID="7c31261b-9dd1-44b8-b6ad-4092f61c1883"
Apr 16 17:43:42.902049 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:42.902027 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m" event={"ID":"203a89e4-4618-4411-aa1c-395cb2b30306","Type":"ContainerStarted","Data":"9d0273e8585590c30a27aa31aba21f03bafafc74ad4d9ddb86ce32ed1d9dda80"}
Apr 16 17:43:43.905609 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:43.905580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m" event={"ID":"203a89e4-4618-4411-aa1c-395cb2b30306","Type":"ContainerStarted","Data":"bc91f86091db1170db1e9f17cdff6bc82c7fb31ed7f41c10db389f2e29aa9dac"}
Apr 16 17:43:43.905609 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:43.905610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m" event={"ID":"203a89e4-4618-4411-aa1c-395cb2b30306","Type":"ContainerStarted","Data":"ba9a3bc27ef8698cce0fea8e0ecdf96b2e7a3c72f574243958cf041633f894bc"}
Apr 16 17:43:43.931628 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:43.931587 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-9b68m" podStartSLOduration=1.962259942 podStartE2EDuration="2.931576662s" podCreationTimestamp="2026-04-16 17:43:41 +0000 UTC" firstStartedPulling="2026-04-16 17:43:42.41379472 +0000 UTC m=+137.494015830" lastFinishedPulling="2026-04-16 17:43:43.383111429 +0000 UTC m=+138.463332550" observedRunningTime="2026-04-16 17:43:43.931096099 +0000 UTC m=+139.011317229" watchObservedRunningTime="2026-04-16 17:43:43.931576662 +0000 UTC m=+139.011797826"
Apr 16 17:43:44.417703 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.417674 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2bgq6"]
Apr 16 17:43:44.420666 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.420649 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2bgq6"
Apr 16 17:43:44.423143 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.423112 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 17:43:44.423303 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.423279 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 17:43:44.423400 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.423328 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 17:43:44.423400 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.423390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 17:43:44.424640 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.424624 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-48zkm\""
Apr 16 17:43:44.430888 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.430871 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2bgq6"]
Apr 16 17:43:44.510561 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.510533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName:
\"kubernetes.io/configmap/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.510652 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.510570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.510652 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.510589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbwt\" (UniqueName: \"kubernetes.io/projected/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-api-access-8kbwt\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.510652 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.510635 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7e926-aa6c-4880-8031-bdee5bf28608-crio-socket\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.510752 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.510663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7e926-aa6c-4880-8031-bdee5bf28608-data-volume\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " 
pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611415 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7e926-aa6c-4880-8031-bdee5bf28608-data-volume\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611480 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611451 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611480 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611605 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbwt\" (UniqueName: \"kubernetes.io/projected/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-api-access-8kbwt\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611605 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:44.611585 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 16 17:43:44.611697 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7e926-aa6c-4880-8031-bdee5bf28608-crio-socket\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611697 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:44.611657 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls podName:d9f7e926-aa6c-4880-8031-bdee5bf28608 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:45.111638533 +0000 UTC m=+140.191859645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2bgq6" (UID: "d9f7e926-aa6c-4880-8031-bdee5bf28608") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:44.611697 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7e926-aa6c-4880-8031-bdee5bf28608-crio-socket\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.611847 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.611781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7e926-aa6c-4880-8031-bdee5bf28608-data-volume\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " 
pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.612031 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.612013 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:44.620297 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:44.620271 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbwt\" (UniqueName: \"kubernetes.io/projected/d9f7e926-aa6c-4880-8031-bdee5bf28608-kube-api-access-8kbwt\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:45.001736 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.001716 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kdbhj"] Apr 16 17:43:45.004661 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.004647 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.012873 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.012849 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 17:43:45.012961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.012881 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 17:43:45.012961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.012902 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-h9k5c\"" Apr 16 17:43:45.012961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.012922 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 17:43:45.014022 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.013948 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 17:43:45.014768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.014744 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kdbhj"] Apr 16 17:43:45.114428 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.114409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-key\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.114502 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.114436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj4rp\" (UniqueName: 
\"kubernetes.io/projected/bc20e63e-f046-4ac5-bbdd-a277639258f6-kube-api-access-kj4rp\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.114502 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.114490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:45.114605 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.114524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-cabundle\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.114643 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:45.114602 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:43:45.114676 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:45.114654 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls podName:d9f7e926-aa6c-4880-8031-bdee5bf28608 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:46.114637705 +0000 UTC m=+141.194858815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2bgq6" (UID: "d9f7e926-aa6c-4880-8031-bdee5bf28608") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:45.215125 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.215100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-cabundle\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.215221 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.215149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-key\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.215221 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.215181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj4rp\" (UniqueName: \"kubernetes.io/projected/bc20e63e-f046-4ac5-bbdd-a277639258f6-kube-api-access-kj4rp\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.215681 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.215665 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-cabundle\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 
17:43:45.217377 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.217362 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bc20e63e-f046-4ac5-bbdd-a277639258f6-signing-key\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.223281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.223259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj4rp\" (UniqueName: \"kubernetes.io/projected/bc20e63e-f046-4ac5-bbdd-a277639258f6-kube-api-access-kj4rp\") pod \"service-ca-bfc587fb7-kdbhj\" (UID: \"bc20e63e-f046-4ac5-bbdd-a277639258f6\") " pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.313229 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.313213 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" Apr 16 17:43:45.419868 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.419840 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kdbhj"] Apr 16 17:43:45.422665 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:43:45.422642 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc20e63e_f046_4ac5_bbdd_a277639258f6.slice/crio-dd8df9378b62d1a9b9e738e3b23feb6d7760a637c7a3a95bebdd9874e38ee16c WatchSource:0}: Error finding container dd8df9378b62d1a9b9e738e3b23feb6d7760a637c7a3a95bebdd9874e38ee16c: Status 404 returned error can't find the container with id dd8df9378b62d1a9b9e738e3b23feb6d7760a637c7a3a95bebdd9874e38ee16c Apr 16 17:43:45.911522 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:45.911468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" 
event={"ID":"bc20e63e-f046-4ac5-bbdd-a277639258f6","Type":"ContainerStarted","Data":"dd8df9378b62d1a9b9e738e3b23feb6d7760a637c7a3a95bebdd9874e38ee16c"} Apr 16 17:43:46.122004 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:46.121974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:46.122424 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:46.122097 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:43:46.122424 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:46.122154 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls podName:d9f7e926-aa6c-4880-8031-bdee5bf28608 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:48.122136899 +0000 UTC m=+143.202358027 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2bgq6" (UID: "d9f7e926-aa6c-4880-8031-bdee5bf28608") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:47.918117 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:47.918082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" event={"ID":"bc20e63e-f046-4ac5-bbdd-a277639258f6","Type":"ContainerStarted","Data":"02495231b30abccb7a7ece42c25b847b7cc4cbf9efec1dfa5baa72d9861147e9"} Apr 16 17:43:47.939193 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:47.939149 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-kdbhj" podStartSLOduration=2.308643957 podStartE2EDuration="3.939135829s" podCreationTimestamp="2026-04-16 17:43:44 +0000 UTC" firstStartedPulling="2026-04-16 17:43:45.424490877 +0000 UTC m=+140.504711984" lastFinishedPulling="2026-04-16 17:43:47.054982736 +0000 UTC m=+142.135203856" observedRunningTime="2026-04-16 17:43:47.938500342 +0000 UTC m=+143.018721471" watchObservedRunningTime="2026-04-16 17:43:47.939135829 +0000 UTC m=+143.019356961" Apr 16 17:43:48.037720 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:48.037698 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" Apr 16 17:43:48.037720 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:48.037722 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" Apr 16 17:43:48.038018 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:48.038006 2570 scope.go:117] "RemoveContainer" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d" Apr 16 17:43:48.038152 ip-10-0-140-62 
kubenswrapper[2570]: E0416 17:43:48.038137 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4977r_openshift-console-operator(7c31261b-9dd1-44b8-b6ad-4092f61c1883)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podUID="7c31261b-9dd1-44b8-b6ad-4092f61c1883" Apr 16 17:43:48.136060 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:48.136032 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:48.136184 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:48.136142 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:43:48.136256 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:48.136209 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls podName:d9f7e926-aa6c-4880-8031-bdee5bf28608 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:52.136191907 +0000 UTC m=+147.216413020 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2bgq6" (UID: "d9f7e926-aa6c-4880-8031-bdee5bf28608") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:52.162941 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:52.162913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:43:52.163277 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:52.163059 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:43:52.163277 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:52.163125 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls podName:d9f7e926-aa6c-4880-8031-bdee5bf28608 nodeName:}" failed. No retries permitted until 2026-04-16 17:44:00.163104797 +0000 UTC m=+155.243325903 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2bgq6" (UID: "d9f7e926-aa6c-4880-8031-bdee5bf28608") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:59.524903 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.524875 2570 scope.go:117] "RemoveContainer" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d" Apr 16 17:43:59.949204 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949177 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log" Apr 16 17:43:59.949546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949533 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/1.log" Apr 16 17:43:59.949599 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949566 2570 generic.go:358] "Generic (PLEG): container finished" podID="7c31261b-9dd1-44b8-b6ad-4092f61c1883" containerID="c61374e8fc926d699aa70266cd82882ad53566dc9fd02d69dae936aa6d6304f9" exitCode=255 Apr 16 17:43:59.949635 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949618 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" event={"ID":"7c31261b-9dd1-44b8-b6ad-4092f61c1883","Type":"ContainerDied","Data":"c61374e8fc926d699aa70266cd82882ad53566dc9fd02d69dae936aa6d6304f9"} Apr 16 17:43:59.949665 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949656 2570 scope.go:117] "RemoveContainer" containerID="ea7c998da089c6396f131f1291efb36fdb11ed14958a2c5b38f622d46a9d015d" Apr 16 17:43:59.949942 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:43:59.949925 2570 scope.go:117] "RemoveContainer" 
containerID="c61374e8fc926d699aa70266cd82882ad53566dc9fd02d69dae936aa6d6304f9" Apr 16 17:43:59.950090 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:43:59.950071 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4977r_openshift-console-operator(7c31261b-9dd1-44b8-b6ad-4092f61c1883)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podUID="7c31261b-9dd1-44b8-b6ad-4092f61c1883" Apr 16 17:44:00.220728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.220679 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:44:00.222681 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.222664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7e926-aa6c-4880-8031-bdee5bf28608-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bgq6\" (UID: \"d9f7e926-aa6c-4880-8031-bdee5bf28608\") " pod="openshift-insights/insights-runtime-extractor-2bgq6" Apr 16 17:44:00.329355 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.329327 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2bgq6"
Apr 16 17:44:00.447014 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.446987 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2bgq6"]
Apr 16 17:44:00.450126 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:00.450101 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f7e926_aa6c_4880_8031_bdee5bf28608.slice/crio-d643c52af37079a0948aab36caf92e9dc407397b65b6fcdbb01b9a80388a24ca WatchSource:0}: Error finding container d643c52af37079a0948aab36caf92e9dc407397b65b6fcdbb01b9a80388a24ca: Status 404 returned error can't find the container with id d643c52af37079a0948aab36caf92e9dc407397b65b6fcdbb01b9a80388a24ca
Apr 16 17:44:00.952601 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.952576 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:44:00.953931 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.953908 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bgq6" event={"ID":"d9f7e926-aa6c-4880-8031-bdee5bf28608","Type":"ContainerStarted","Data":"ce521e92410e239e40ba5e67eb4a9e8ac3cdab1617d0121c2d34e4df759a2167"}
Apr 16 17:44:00.954033 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:00.953938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bgq6" event={"ID":"d9f7e926-aa6c-4880-8031-bdee5bf28608","Type":"ContainerStarted","Data":"d643c52af37079a0948aab36caf92e9dc407397b65b6fcdbb01b9a80388a24ca"}
Apr 16 17:44:01.362803 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:01.362769 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-95rfp" podUID="cca91f7c-ca0a-4c7a-97bc-93c7e35d6271"
Apr 16 17:44:01.369295 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:01.369270 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4ctsq" podUID="64a21724-619d-4a2f-b61f-bf293308211b"
Apr 16 17:44:01.542257 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:01.542226 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l7h7z" podUID="38d21ef6-c2df-4bbd-8185-bf4fff5cb835"
Apr 16 17:44:01.958246 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:01.958196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bgq6" event={"ID":"d9f7e926-aa6c-4880-8031-bdee5bf28608","Type":"ContainerStarted","Data":"b4c888bcdbd903f4e9765bcec248eb5ad5d411f91f87e94682cee52ce23ae448"}
Apr 16 17:44:01.958246 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:01.958224 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:44:01.958715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:01.958423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95rfp"
Apr 16 17:44:02.962150 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:02.962061 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bgq6" event={"ID":"d9f7e926-aa6c-4880-8031-bdee5bf28608","Type":"ContainerStarted","Data":"fa49f669e9acf177161331320e7de6ad89ce7cd0db625ad680f9df01085f2e07"}
Apr 16 17:44:02.989737 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:02.989694 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2bgq6" podStartSLOduration=16.80977811 podStartE2EDuration="18.989681334s" podCreationTimestamp="2026-04-16 17:43:44 +0000 UTC" firstStartedPulling="2026-04-16 17:44:00.507490985 +0000 UTC m=+155.587712092" lastFinishedPulling="2026-04-16 17:44:02.687394197 +0000 UTC m=+157.767615316" observedRunningTime="2026-04-16 17:44:02.987771886 +0000 UTC m=+158.067993016" watchObservedRunningTime="2026-04-16 17:44:02.989681334 +0000 UTC m=+158.069902457"
Apr 16 17:44:05.994291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:05.994260 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6cb5b774b5-jf8zl"]
Apr 16 17:44:05.998865 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:05.998842 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.001635 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.001617 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 17:44:06.001761 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.001733 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vf8ds\""
Apr 16 17:44:06.002959 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.002938 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 17:44:06.003030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.002944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 17:44:06.006878 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.006861 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 17:44:06.010580 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.010560 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cb5b774b5-jf8zl"]
Apr 16 17:44:06.159405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159380 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-registry-certificates\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159522 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159407 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-installation-pull-secrets\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159522 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-registry-tls\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159616 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-bound-sa-token\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159616 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-image-registry-private-configuration\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159690 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/640fee70-6120-4a2c-9c86-bec1973f853e-ca-trust-extracted\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159690 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-trusted-ca\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.159690 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.159684 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52jm\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-kube-api-access-d52jm\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260726 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260673 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-trusted-ca\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260726 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d52jm\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-kube-api-access-d52jm\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260726 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-registry-certificates\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260881 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-installation-pull-secrets\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260881 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-registry-tls\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260881 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-bound-sa-token\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.260881 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-image-registry-private-configuration\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.261067 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.260892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/640fee70-6120-4a2c-9c86-bec1973f853e-ca-trust-extracted\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.261299 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.261274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/640fee70-6120-4a2c-9c86-bec1973f853e-ca-trust-extracted\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.261636 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.261609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-registry-certificates\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.261775 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.261753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/640fee70-6120-4a2c-9c86-bec1973f853e-trusted-ca\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.263351 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.263318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-image-registry-private-configuration\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.263429 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.263333 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/640fee70-6120-4a2c-9c86-bec1973f853e-installation-pull-secrets\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.263708 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.263692 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-registry-tls\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.268247 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.268229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52jm\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-kube-api-access-d52jm\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.268355 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.268291 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/640fee70-6120-4a2c-9c86-bec1973f853e-bound-sa-token\") pod \"image-registry-6cb5b774b5-jf8zl\" (UID: \"640fee70-6120-4a2c-9c86-bec1973f853e\") " pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.307538 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.307487 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.362103 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.362075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:44:06.362223 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.362128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp"
Apr 16 17:44:06.364359 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.364335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a21724-619d-4a2f-b61f-bf293308211b-cert\") pod \"ingress-canary-4ctsq\" (UID: \"64a21724-619d-4a2f-b61f-bf293308211b\") " pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:44:06.364442 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.364370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cca91f7c-ca0a-4c7a-97bc-93c7e35d6271-metrics-tls\") pod \"dns-default-95rfp\" (UID: \"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271\") " pod="openshift-dns/dns-default-95rfp"
Apr 16 17:44:06.461724 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.461698 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2vlxb\""
Apr 16 17:44:06.461724 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.461698 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w8phs\""
Apr 16 17:44:06.469442 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.469421 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95rfp"
Apr 16 17:44:06.469578 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.469486 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4ctsq"
Apr 16 17:44:06.591712 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.591682 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4ctsq"]
Apr 16 17:44:06.595284 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:06.595253 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a21724_619d_4a2f_b61f_bf293308211b.slice/crio-77cea9953cdcb73e1c03bd3bf256c480ac24f91107da7dd4174ca9280c5bc681 WatchSource:0}: Error finding container 77cea9953cdcb73e1c03bd3bf256c480ac24f91107da7dd4174ca9280c5bc681: Status 404 returned error can't find the container with id 77cea9953cdcb73e1c03bd3bf256c480ac24f91107da7dd4174ca9280c5bc681
Apr 16 17:44:06.609291 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.609270 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-95rfp"]
Apr 16 17:44:06.611959 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:06.611934 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca91f7c_ca0a_4c7a_97bc_93c7e35d6271.slice/crio-da9451ab1c64bb10aac8bd25ca3d088223c18b047294dabcca349c3f9f1481b4 WatchSource:0}: Error finding container da9451ab1c64bb10aac8bd25ca3d088223c18b047294dabcca349c3f9f1481b4: Status 404 returned error can't find the container with id da9451ab1c64bb10aac8bd25ca3d088223c18b047294dabcca349c3f9f1481b4
Apr 16 17:44:06.646092 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.646072 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cb5b774b5-jf8zl"]
Apr 16 17:44:06.648903 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:06.648875 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640fee70_6120_4a2c_9c86_bec1973f853e.slice/crio-cc90fe7c9be3b8738a11c9a05119c7e98a6c7ec7bcf41081a6fbe90d6257aafe WatchSource:0}: Error finding container cc90fe7c9be3b8738a11c9a05119c7e98a6c7ec7bcf41081a6fbe90d6257aafe: Status 404 returned error can't find the container with id cc90fe7c9be3b8738a11c9a05119c7e98a6c7ec7bcf41081a6fbe90d6257aafe
Apr 16 17:44:06.971695 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.971662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95rfp" event={"ID":"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271","Type":"ContainerStarted","Data":"da9451ab1c64bb10aac8bd25ca3d088223c18b047294dabcca349c3f9f1481b4"}
Apr 16 17:44:06.973226 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.973195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4ctsq" event={"ID":"64a21724-619d-4a2f-b61f-bf293308211b","Type":"ContainerStarted","Data":"77cea9953cdcb73e1c03bd3bf256c480ac24f91107da7dd4174ca9280c5bc681"}
Apr 16 17:44:06.974863 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.974838 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl" event={"ID":"640fee70-6120-4a2c-9c86-bec1973f853e","Type":"ContainerStarted","Data":"7e589f3c0d4b0de1159a517b3ed63c35e6d321a2d79781c377885d40a7ae5da0"}
Apr 16 17:44:06.974994 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.974869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl" event={"ID":"640fee70-6120-4a2c-9c86-bec1973f853e","Type":"ContainerStarted","Data":"cc90fe7c9be3b8738a11c9a05119c7e98a6c7ec7bcf41081a6fbe90d6257aafe"}
Apr 16 17:44:06.975061 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.974999 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl"
Apr 16 17:44:06.995943 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:06.995896 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl" podStartSLOduration=1.995877218 podStartE2EDuration="1.995877218s" podCreationTimestamp="2026-04-16 17:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:44:06.994252635 +0000 UTC m=+162.074473973" watchObservedRunningTime="2026-04-16 17:44:06.995877218 +0000 UTC m=+162.076098347"
Apr 16 17:44:08.037224 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.037195 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:44:08.037224 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.037230 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r"
Apr 16 17:44:08.037707 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.037631 2570 scope.go:117] "RemoveContainer" containerID="c61374e8fc926d699aa70266cd82882ad53566dc9fd02d69dae936aa6d6304f9"
Apr 16 17:44:08.037856 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:08.037835 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4977r_openshift-console-operator(7c31261b-9dd1-44b8-b6ad-4092f61c1883)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podUID="7c31261b-9dd1-44b8-b6ad-4092f61c1883"
Apr 16 17:44:08.982151 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.982122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95rfp" event={"ID":"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271","Type":"ContainerStarted","Data":"41acaae566a7279f05ae50595bedb318a09f5c75160bc682aa8720980607c6d0"}
Apr 16 17:44:08.982282 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.982156 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95rfp" event={"ID":"cca91f7c-ca0a-4c7a-97bc-93c7e35d6271","Type":"ContainerStarted","Data":"19028426aef6f9a739c5415554dcf8cdbef01aea2430f406d763b2ed8678f50d"}
Apr 16 17:44:08.982282 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.982266 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-95rfp"
Apr 16 17:44:08.983460 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.983441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4ctsq" event={"ID":"64a21724-619d-4a2f-b61f-bf293308211b","Type":"ContainerStarted","Data":"627ed8da715736fca099d6097adc76cd36814f13f1705fb25f51bba8600692a5"}
Apr 16 17:44:09.000032 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:08.999996 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-95rfp" podStartSLOduration=129.225844555 podStartE2EDuration="2m10.999985057s" podCreationTimestamp="2026-04-16 17:41:58 +0000 UTC" firstStartedPulling="2026-04-16 17:44:06.613897812 +0000 UTC m=+161.694118919" lastFinishedPulling="2026-04-16 17:44:08.388038312 +0000 UTC m=+163.468259421" observedRunningTime="2026-04-16 17:44:08.999685715 +0000 UTC m=+164.079906845" watchObservedRunningTime="2026-04-16 17:44:08.999985057 +0000 UTC m=+164.080206186"
Apr 16 17:44:09.016877 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:09.016843 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4ctsq" podStartSLOduration=129.222609456 podStartE2EDuration="2m11.016832676s" podCreationTimestamp="2026-04-16 17:41:58 +0000 UTC" firstStartedPulling="2026-04-16 17:44:06.597043746 +0000 UTC m=+161.677264857" lastFinishedPulling="2026-04-16 17:44:08.39126695 +0000 UTC m=+163.471488077" observedRunningTime="2026-04-16 17:44:09.016123359 +0000 UTC m=+164.096344489" watchObservedRunningTime="2026-04-16 17:44:09.016832676 +0000 UTC m=+164.097053805"
Apr 16 17:44:12.456405 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.456364 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jlmqv"]
Apr 16 17:44:12.460700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.460684 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.463578 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.463551 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 17:44:12.463578 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.463568 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 17:44:12.463744 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.463589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gp5hh\""
Apr 16 17:44:12.464694 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.464655 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:44:12.464694 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.464682 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 17:44:12.464865 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.464717 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 17:44:12.464865 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.464663 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 17:44:12.609141 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-metrics-client-ca\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609271 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609319 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-wtmp\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609319 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609301 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-root\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609399 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8rf\" (UniqueName: \"kubernetes.io/projected/00e7d4de-424d-4342-802b-f7c98a15bf8b-kube-api-access-4r8rf\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609446 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-textfile\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609446 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609429 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-sys\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.609534 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.609452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.709935 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.709882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.709935 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.709912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-metrics-client-ca\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710063 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.709956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710063 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.709982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-wtmp\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710063 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-root\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710063 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8rf\" (UniqueName: \"kubernetes.io/projected/00e7d4de-424d-4342-802b-f7c98a15bf8b-kube-api-access-4r8rf\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710238 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-textfile\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710238 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-sys\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710238 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710238 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-sys\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710427 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:12.710240 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 17:44:12.710427 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:44:12.710296 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls podName:00e7d4de-424d-4342-802b-f7c98a15bf8b nodeName:}" failed. No retries permitted until 2026-04-16 17:44:13.210278227 +0000 UTC m=+168.290499337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls") pod "node-exporter-jlmqv" (UID: "00e7d4de-424d-4342-802b-f7c98a15bf8b") : secret "node-exporter-tls" not found
Apr 16 17:44:12.710546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-textfile\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-wtmp\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv"
Apr 16 17:44:12.710546 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName:
\"kubernetes.io/host-path/00e7d4de-424d-4342-802b-f7c98a15bf8b-root\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:12.710890 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.710875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00e7d4de-424d-4342-802b-f7c98a15bf8b-metrics-client-ca\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:12.712404 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.712388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:12.718474 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:12.718453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8rf\" (UniqueName: \"kubernetes.io/projected/00e7d4de-424d-4342-802b-f7c98a15bf8b-kube-api-access-4r8rf\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:13.213381 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:13.213347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:13.215298 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:13.215282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00e7d4de-424d-4342-802b-f7c98a15bf8b-node-exporter-tls\") pod \"node-exporter-jlmqv\" (UID: \"00e7d4de-424d-4342-802b-f7c98a15bf8b\") " pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:13.371301 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:13.371252 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jlmqv" Apr 16 17:44:13.379195 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:13.379174 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00e7d4de_424d_4342_802b_f7c98a15bf8b.slice/crio-e9282347d93241cc630db9699cd6ca261d5a070f6ebca076756cc3b766579658 WatchSource:0}: Error finding container e9282347d93241cc630db9699cd6ca261d5a070f6ebca076756cc3b766579658: Status 404 returned error can't find the container with id e9282347d93241cc630db9699cd6ca261d5a070f6ebca076756cc3b766579658 Apr 16 17:44:13.742090 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:13.742010 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" podUID="9706163e-8fe9-4ea7-9c1e-2114a03c13f2" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.8:8000/readyz\": dial tcp 10.134.0.8:8000: connect: connection refused" Apr 16 17:44:14.000709 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.000642 2570 generic.go:358] "Generic (PLEG): container finished" podID="9706163e-8fe9-4ea7-9c1e-2114a03c13f2" containerID="2e6e3cfe87f88f1d9833e6f0fce4537321bd96dbfd936cef40fd4235344511ab" exitCode=1 Apr 16 17:44:14.000863 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.000722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" 
event={"ID":"9706163e-8fe9-4ea7-9c1e-2114a03c13f2","Type":"ContainerDied","Data":"2e6e3cfe87f88f1d9833e6f0fce4537321bd96dbfd936cef40fd4235344511ab"} Apr 16 17:44:14.001088 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.001073 2570 scope.go:117] "RemoveContainer" containerID="2e6e3cfe87f88f1d9833e6f0fce4537321bd96dbfd936cef40fd4235344511ab" Apr 16 17:44:14.001660 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.001628 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlmqv" event={"ID":"00e7d4de-424d-4342-802b-f7c98a15bf8b","Type":"ContainerStarted","Data":"e9282347d93241cc630db9699cd6ca261d5a070f6ebca076756cc3b766579658"} Apr 16 17:44:14.004223 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.003836 2570 generic.go:358] "Generic (PLEG): container finished" podID="871088ca-9a0a-4901-b6b4-8f72389ab255" containerID="1021f475d4c0ea48eb4f335659697f69743a8ae96cadbdd094180747791a1405" exitCode=255 Apr 16 17:44:14.004223 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.003868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" event={"ID":"871088ca-9a0a-4901-b6b4-8f72389ab255","Type":"ContainerDied","Data":"1021f475d4c0ea48eb4f335659697f69743a8ae96cadbdd094180747791a1405"} Apr 16 17:44:14.010715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:14.010697 2570 scope.go:117] "RemoveContainer" containerID="1021f475d4c0ea48eb4f335659697f69743a8ae96cadbdd094180747791a1405" Apr 16 17:44:15.008185 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.008159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5954cd7dff-kxbzd" event={"ID":"871088ca-9a0a-4901-b6b4-8f72389ab255","Type":"ContainerStarted","Data":"e168a22176aafe933b1db918c957fbdee18f3ca616cacb1425854793e236817e"} Apr 16 17:44:15.009726 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.009699 
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" event={"ID":"9706163e-8fe9-4ea7-9c1e-2114a03c13f2","Type":"ContainerStarted","Data":"b597023b95da202c956e4c3567b3c0240b958d54e245206ba94066612cf2fcf9"} Apr 16 17:44:15.009980 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.009967 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:44:15.010449 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.010433 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-659648565c-cvtzt" Apr 16 17:44:15.011095 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.011076 2570 generic.go:358] "Generic (PLEG): container finished" podID="00e7d4de-424d-4342-802b-f7c98a15bf8b" containerID="c9d9773975cdcc7c5a1f943f14646f9f599f8cd9d2f1a69e6271078e1e2952dd" exitCode=0 Apr 16 17:44:15.011205 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.011104 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlmqv" event={"ID":"00e7d4de-424d-4342-802b-f7c98a15bf8b","Type":"ContainerDied","Data":"c9d9773975cdcc7c5a1f943f14646f9f599f8cd9d2f1a69e6271078e1e2952dd"} Apr 16 17:44:15.526829 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:15.526800 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:44:16.015567 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:16.015532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlmqv" event={"ID":"00e7d4de-424d-4342-802b-f7c98a15bf8b","Type":"ContainerStarted","Data":"44cf9c6c1eca93012970d964263cfa379f19af7669c52418634a4d339395f2f7"} Apr 16 17:44:16.015567 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:16.015568 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlmqv" event={"ID":"00e7d4de-424d-4342-802b-f7c98a15bf8b","Type":"ContainerStarted","Data":"543b0e16e9754bb292fafc78ab2caacdaa99f9953575cc3e6628d6985b62c686"} Apr 16 17:44:16.040665 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:16.040623 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jlmqv" podStartSLOduration=3.368150726 podStartE2EDuration="4.040610301s" podCreationTimestamp="2026-04-16 17:44:12 +0000 UTC" firstStartedPulling="2026-04-16 17:44:13.380814556 +0000 UTC m=+168.461035664" lastFinishedPulling="2026-04-16 17:44:14.053274132 +0000 UTC m=+169.133495239" observedRunningTime="2026-04-16 17:44:16.039287408 +0000 UTC m=+171.119508537" watchObservedRunningTime="2026-04-16 17:44:16.040610301 +0000 UTC m=+171.120831429" Apr 16 17:44:17.648674 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.648643 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5f57875499-n9xmm"] Apr 16 17:44:17.651983 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.651969 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.654621 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.654595 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 17:44:17.654735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.654674 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 17:44:17.654735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.654678 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 17:44:17.654735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.654694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mmvlc\"" Apr 16 17:44:17.654735 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.654721 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 17:44:17.658940 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.655262 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 17:44:17.662390 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.662369 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 17:44:17.665264 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.665241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f57875499-n9xmm"] Apr 16 17:44:17.845688 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-serving-certs-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.845784 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.845784 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845756 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.845887 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-federate-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.845887 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845808 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx2j6\" (UniqueName: 
\"kubernetes.io/projected/8cc2e311-0c6d-40bc-8b60-2efa8925192f-kube-api-access-zx2j6\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.845887 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.846054 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.846054 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.845925 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-metrics-client-ca\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947135 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-serving-certs-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " 
pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947135 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947183 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-federate-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx2j6\" (UniqueName: \"kubernetes.io/projected/8cc2e311-0c6d-40bc-8b60-2efa8925192f-kube-api-access-zx2j6\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947244 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947534 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-metrics-client-ca\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.947910 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.947882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-serving-certs-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.948078 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.948058 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: 
\"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.948326 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.948304 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cc2e311-0c6d-40bc-8b60-2efa8925192f-metrics-client-ca\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.949610 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.949586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.949807 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.949785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-telemeter-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.949875 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.949809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-federate-client-tls\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.949930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.949901 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8cc2e311-0c6d-40bc-8b60-2efa8925192f-secret-telemeter-client\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.956897 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.956877 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx2j6\" (UniqueName: \"kubernetes.io/projected/8cc2e311-0c6d-40bc-8b60-2efa8925192f-kube-api-access-zx2j6\") pod \"telemeter-client-5f57875499-n9xmm\" (UID: \"8cc2e311-0c6d-40bc-8b60-2efa8925192f\") " pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:17.964593 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:17.964575 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" Apr 16 17:44:18.086463 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.086435 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f57875499-n9xmm"] Apr 16 17:44:18.089484 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:18.089457 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc2e311_0c6d_40bc_8b60_2efa8925192f.slice/crio-298e4f0359e09999af8f31dfbb2edb2878c090a3184cf4cea5f2ad7cc71d660b WatchSource:0}: Error finding container 298e4f0359e09999af8f31dfbb2edb2878c090a3184cf4cea5f2ad7cc71d660b: Status 404 returned error can't find the container with id 298e4f0359e09999af8f31dfbb2edb2878c090a3184cf4cea5f2ad7cc71d660b Apr 16 17:44:18.633487 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.633462 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:44:18.638209 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.638194 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.641696 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.641655 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 17:44:18.641696 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.641672 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 17:44:18.643030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.642873 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 17:44:18.643030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.642891 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 17:44:18.643030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.642897 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9lm9q\"" Apr 16 17:44:18.643030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.642902 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 17:44:18.643030 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643028 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 17:44:18.643331 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643145 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 17:44:18.643331 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643192 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f1cc48p2uj6v5\"" Apr 16 17:44:18.643615 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643576 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 17:44:18.643615 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643587 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 17:44:18.643769 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643624 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 17:44:18.643769 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 17:44:18.643769 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.643657 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 17:44:18.645141 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.645123 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 17:44:18.654432 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.654414 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:44:18.752404 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752380 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752404 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752545 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752575 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752651 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jpl\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:44:18.752747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752831 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752854 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.752934 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.753316 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.753316 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.752989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854046 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854016 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854154 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854181 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854213 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854269 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854449 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854616 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76jpl\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.854682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.855015 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.854681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.855638 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.855612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.856978 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.856953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.857364 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.857231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.857364 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.857315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.857535 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.857446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.857603 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.857561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.858094 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.858071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.858367 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.858316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.858451 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.858431 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.858570 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.858522 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.859070 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.859023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.859931 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.859907 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.860055 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.860033 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.860742 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.860721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.860850 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.860837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.861434 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.861412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.866722 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.866697 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jpl\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl\") pod \"prometheus-k8s-0\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.948403 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.948332 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:44:18.988302 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:18.988271 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-95rfp" Apr 16 17:44:19.025368 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:19.025342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" event={"ID":"8cc2e311-0c6d-40bc-8b60-2efa8925192f","Type":"ContainerStarted","Data":"298e4f0359e09999af8f31dfbb2edb2878c090a3184cf4cea5f2ad7cc71d660b"} Apr 16 17:44:19.091573 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:19.091544 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:44:19.095315 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:44:19.095286 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146f79bd_0caa_448e_9c3a_ad348c633471.slice/crio-945b5c76b6bbb2092ff140b27c39e0f61e9f9965f340b9a90474a93653989d18 WatchSource:0}: Error finding container 945b5c76b6bbb2092ff140b27c39e0f61e9f9965f340b9a90474a93653989d18: Status 404 returned error can't find the container with id 945b5c76b6bbb2092ff140b27c39e0f61e9f9965f340b9a90474a93653989d18 Apr 16 17:44:20.030044 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:20.029968 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" 
event={"ID":"8cc2e311-0c6d-40bc-8b60-2efa8925192f","Type":"ContainerStarted","Data":"58bb6429107d4b8757e84f529373e9b0204de37bcc97264cf585e90a71c62582"} Apr 16 17:44:20.031139 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:20.031103 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"945b5c76b6bbb2092ff140b27c39e0f61e9f9965f340b9a90474a93653989d18"} Apr 16 17:44:21.035737 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.035709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" event={"ID":"8cc2e311-0c6d-40bc-8b60-2efa8925192f","Type":"ContainerStarted","Data":"c10a423aeb31888a264abc85abbe9e0249e51c27f506397b5716dd8db102d91d"} Apr 16 17:44:21.035737 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.035741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" event={"ID":"8cc2e311-0c6d-40bc-8b60-2efa8925192f","Type":"ContainerStarted","Data":"c8d799e06f7fd3821dd0025bdda7b747c0a604d8e2991325277020de0571cdc9"} Apr 16 17:44:21.037014 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.036990 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e" exitCode=0 Apr 16 17:44:21.037105 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.037038 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} Apr 16 17:44:21.067304 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.067262 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5f57875499-n9xmm" 
podStartSLOduration=1.8695097330000001 podStartE2EDuration="4.067249806s" podCreationTimestamp="2026-04-16 17:44:17 +0000 UTC" firstStartedPulling="2026-04-16 17:44:18.091420583 +0000 UTC m=+173.171641691" lastFinishedPulling="2026-04-16 17:44:20.289160651 +0000 UTC m=+175.369381764" observedRunningTime="2026-04-16 17:44:21.065605365 +0000 UTC m=+176.145826493" watchObservedRunningTime="2026-04-16 17:44:21.067249806 +0000 UTC m=+176.147470934" Apr 16 17:44:21.524974 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:21.524950 2570 scope.go:117] "RemoveContainer" containerID="c61374e8fc926d699aa70266cd82882ad53566dc9fd02d69dae936aa6d6304f9" Apr 16 17:44:22.047586 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:22.047557 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log" Apr 16 17:44:22.048100 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:22.047752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" event={"ID":"7c31261b-9dd1-44b8-b6ad-4092f61c1883","Type":"ContainerStarted","Data":"4ce4d5b364236d813eb616a206b6c36d68bd03982ff2d7f887e8b1848d53bca3"} Apr 16 17:44:22.048399 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:22.048380 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" Apr 16 17:44:22.067866 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:22.067821 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" podStartSLOduration=43.20636881 podStartE2EDuration="45.067806078s" podCreationTimestamp="2026-04-16 17:43:37 +0000 UTC" firstStartedPulling="2026-04-16 17:43:38.155415878 +0000 UTC m=+133.235636984" lastFinishedPulling="2026-04-16 17:43:40.016853144 +0000 UTC m=+135.097074252" 
observedRunningTime="2026-04-16 17:44:22.067167279 +0000 UTC m=+177.147388403" watchObservedRunningTime="2026-04-16 17:44:22.067806078 +0000 UTC m=+177.148027210" Apr 16 17:44:22.131327 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:22.131278 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4977r" Apr 16 17:44:24.055423 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:24.055390 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} Apr 16 17:44:24.055787 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:24.055429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} Apr 16 17:44:26.064472 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:26.064438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} Apr 16 17:44:26.064797 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:26.064478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} Apr 16 17:44:26.064797 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:26.064495 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} Apr 16 17:44:26.064797 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:26.064533 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerStarted","Data":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} Apr 16 17:44:26.092622 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:26.092582 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.77459343 podStartE2EDuration="8.092569799s" podCreationTimestamp="2026-04-16 17:44:18 +0000 UTC" firstStartedPulling="2026-04-16 17:44:19.097525856 +0000 UTC m=+174.177746977" lastFinishedPulling="2026-04-16 17:44:25.415502238 +0000 UTC m=+180.495723346" observedRunningTime="2026-04-16 17:44:26.092188055 +0000 UTC m=+181.172409226" watchObservedRunningTime="2026-04-16 17:44:26.092569799 +0000 UTC m=+181.172790929" Apr 16 17:44:27.981721 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:27.981698 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6cb5b774b5-jf8zl" Apr 16 17:44:28.948570 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:44:28.948536 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:18.948820 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:18.948791 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:18.967450 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:18.967426 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:19.213592 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:45:19.213499 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:37.002483 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002454 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:45:37.002930 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002881 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="prometheus" containerID="cri-o://763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9" gracePeriod=600
Apr 16 17:45:37.003005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002928 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="config-reloader" containerID="cri-o://134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb" gracePeriod=600
Apr 16 17:45:37.003005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002936 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850" gracePeriod=600
Apr 16 17:45:37.003005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002954 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-web" containerID="cri-o://e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8" gracePeriod=600
Apr 16 17:45:37.003005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002944 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="thanos-sidecar" containerID="cri-o://38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842" gracePeriod=600
Apr 16 17:45:37.003224 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.002900 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy" containerID="cri-o://b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7" gracePeriod=600
Apr 16 17:45:37.226408 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.226388 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252735 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850" exitCode=0
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252758 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7" exitCode=0
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252768 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8" exitCode=0
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252779 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842" exitCode=0
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252787 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb" exitCode=0
Apr 16 17:45:37.252791 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252795 2570 generic.go:358] "Generic (PLEG): container finished" podID="146f79bd-0caa-448e-9c3a-ad348c633471" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9" exitCode=0
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252835 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252850 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252872 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252906 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"146f79bd-0caa-448e-9c3a-ad348c633471","Type":"ContainerDied","Data":"945b5c76b6bbb2092ff140b27c39e0f61e9f9965f340b9a90474a93653989d18"}
Apr 16 17:45:37.253052 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.252916 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.260689 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.260482 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.268441 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.268405 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.275019 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.274987 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.282489 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.282465 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.290020 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.290003 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.297243 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.297226 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.303610 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.303595 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.303872 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.303854 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.303922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.303881 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.303922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.303913 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.304161 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.304144 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.304203 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304167 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist"
Apr 16 17:45:37.304203 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304183 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.304410 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.304395 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.304451 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304413 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist"
Apr 16 17:45:37.304451 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304431 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.304699 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.304676 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.304743 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304710 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist"
Apr 16 17:45:37.304743 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304730 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.304951 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.304932 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.305047 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304953 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist"
Apr 16 17:45:37.305047 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.304970 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.305180 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.305156 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.305217 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305189 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist"
Apr 16 17:45:37.305217 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305210 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.305438 ip-10-0-140-62 kubenswrapper[2570]: E0416 17:45:37.305420 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.305536 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305444 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist"
Apr 16 17:45:37.305536 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305464 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.305707 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305689 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.305761 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305709 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.305921 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305902 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist"
Apr 16 17:45:37.305987 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.305924 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.306128 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306113 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist"
Apr 16 17:45:37.306128 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306128 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.306326 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306309 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist"
Apr 16 17:45:37.306394 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306328 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.306560 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306542 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist"
Apr 16 17:45:37.306614 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306563 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.306759 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306740 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist"
Apr 16 17:45:37.306824 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306761 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.306952 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306934 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist"
Apr 16 17:45:37.307028 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.306953 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.307157 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307139 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.307201 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307158 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.307378 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307361 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist"
Apr 16 17:45:37.307447 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307380 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.307663 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307645 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist"
Apr 16 17:45:37.307716 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307664 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.307885 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307868 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist"
Apr 16 17:45:37.307936 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.307885 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.308083 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308068 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist"
Apr 16 17:45:37.308134 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308083 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.308282 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308266 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist"
Apr 16 17:45:37.308332 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308282 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.308473 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308458 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist"
Apr 16 17:45:37.308539 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308474 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.308690 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308672 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.308760 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308692 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.308921 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308904 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist"
Apr 16 17:45:37.308980 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.308923 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.309171 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309155 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist"
Apr 16 17:45:37.309236 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309172 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.309386 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309372 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist"
Apr 16 17:45:37.309446 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309386 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.309602 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309588 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist"
Apr 16 17:45:37.309655 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309603 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.309785 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309770 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist"
Apr 16 17:45:37.309785 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309785 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.309955 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309938 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist"
Apr 16 17:45:37.310010 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.309955 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.310132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310119 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.310179 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310132 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.310344 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310327 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist"
Apr 16 17:45:37.310387 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310344 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"
Apr 16 17:45:37.310629 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310603 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist"
Apr 16 17:45:37.310700 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310630 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"
Apr 16 17:45:37.310874 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310856 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist"
Apr 16 17:45:37.310922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.310875 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"
Apr 16 17:45:37.311115 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311097 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist"
Apr 16 17:45:37.311115 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311114 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"
Apr 16 17:45:37.311303 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311287 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist"
Apr 16 17:45:37.311343 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311303 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"
Apr 16 17:45:37.311524 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311485 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist"
Apr 16 17:45:37.311604 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311526 2570 scope.go:117] "RemoveContainer" containerID="e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"
Apr 16 17:45:37.311746 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311730 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850"} err="failed to get container status \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": rpc error: code = NotFound desc = could not find container \"e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850\": container with ID starting with e283d5572a43bb450f1f23d383171f8db4e17d3b6d755a47ef51fbde0cef5850 not found: ID does not exist"
Apr 16 17:45:37.311802 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311747 2570 scope.go:117] "RemoveContainer" containerID="b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"
Apr 16 17:45:37.311922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311904 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7"} err="failed to get container status
\"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": rpc error: code = NotFound desc = could not find container \"b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7\": container with ID starting with b06c5dcbd7e216ed8d2f84fb995051274ac604b4d72a5aa085dad527bc76eba7 not found: ID does not exist" Apr 16 17:45:37.311963 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.311923 2570 scope.go:117] "RemoveContainer" containerID="e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8" Apr 16 17:45:37.312168 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312149 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8"} err="failed to get container status \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": rpc error: code = NotFound desc = could not find container \"e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8\": container with ID starting with e99b0fa8aefe002d234e0f83154f53eb62bac5a28ed7597d5b745f379a58e1a8 not found: ID does not exist" Apr 16 17:45:37.312168 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312168 2570 scope.go:117] "RemoveContainer" containerID="38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842" Apr 16 17:45:37.312406 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312382 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842"} err="failed to get container status \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": rpc error: code = NotFound desc = could not find container \"38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842\": container with ID starting with 38ed8438e118700aa049516397789da4b5a9830b6e2b38e8d1107906c81ce842 not found: ID does not exist" Apr 16 17:45:37.312450 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:45:37.312409 2570 scope.go:117] "RemoveContainer" containerID="134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb" Apr 16 17:45:37.312618 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312599 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb"} err="failed to get container status \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": rpc error: code = NotFound desc = could not find container \"134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb\": container with ID starting with 134f48b7b2fb9454a0cb61bb560642a588f78edf8ad509ff5ab96fe8bb3959fb not found: ID does not exist" Apr 16 17:45:37.312684 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312618 2570 scope.go:117] "RemoveContainer" containerID="763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9" Apr 16 17:45:37.312856 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312832 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9"} err="failed to get container status \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": rpc error: code = NotFound desc = could not find container \"763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9\": container with ID starting with 763171447b26ceffc09679104622802eef77202b2ec0d4d980da46b11072b7e9 not found: ID does not exist" Apr 16 17:45:37.312909 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.312857 2570 scope.go:117] "RemoveContainer" containerID="131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e" Apr 16 17:45:37.313087 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.313070 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e"} err="failed to get container status \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": rpc error: code = NotFound desc = could not find container \"131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e\": container with ID starting with 131c626b5b14eb0d72513d03fc944b71ff3ba49b8f8bd7745d09781d5d078b1e not found: ID does not exist" Apr 16 17:45:37.316267 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316249 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316279 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316311 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316354 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316334 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 
17:45:37.316535 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316443 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316535 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316477 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316535 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316529 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316555 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316580 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: 
\"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316617 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316667 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76jpl\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.316699 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316671 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316714 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316743 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316777 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316800 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316844 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 
ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316871 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.316902 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle\") pod \"146f79bd-0caa-448e-9c3a-ad348c633471\" (UID: \"146f79bd-0caa-448e-9c3a-ad348c633471\") " Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.317151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:45:37.317233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.317204 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.317919 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.317900 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:37.318345 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.318321 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:37.319218 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.319195 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:37.319415 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.319397 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.320133 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.319861 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). 
InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:37.320133 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.319934 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.320133 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.319993 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.320133 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.320091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d21ef6-c2df-4bbd-8185-bf4fff5cb835-metrics-certs\") pod \"network-metrics-daemon-l7h7z\" (UID: \"38d21ef6-c2df-4bbd-8185-bf4fff5cb835\") " pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:45:37.320401 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.320130 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.320401 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.320241 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.320636 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.320592 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:45:37.321220 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321192 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:37.321364 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321337 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.321456 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321411 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out" (OuterVolumeSpecName: "config-out") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:37.321456 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321437 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.321769 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321754 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl" (OuterVolumeSpecName: "kube-api-access-76jpl") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "kube-api-access-76jpl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:45:37.321961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.321944 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config" (OuterVolumeSpecName: "config") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.331116 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.331096 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config" (OuterVolumeSpecName: "web-config") pod "146f79bd-0caa-448e-9c3a-ad348c633471" (UID: "146f79bd-0caa-448e-9c3a-ad348c633471"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:37.418064 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418043 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418064 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418075 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418085 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-tls-assets\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418094 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-prometheus-k8s-db\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418102 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-grpc-tls\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418111 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418121 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418132 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418141 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418152 2570 reconciler_common.go:299] "Volume detached for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-metrics-client-certs\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418166 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-config\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418180 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76jpl\" (UniqueName: \"kubernetes.io/projected/146f79bd-0caa-448e-9c3a-ad348c633471-kube-api-access-76jpl\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418188 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/146f79bd-0caa-448e-9c3a-ad348c633471-configmap-metrics-client-ca\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418197 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/146f79bd-0caa-448e-9c3a-ad348c633471-config-out\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418205 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-secret-kube-rbac-proxy\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.418366 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.418213 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/146f79bd-0caa-448e-9c3a-ad348c633471-web-config\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 17:45:37.430682 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.430665 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8xx6l\"" Apr 16 17:45:37.439010 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.438987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l7h7z" Apr 16 17:45:37.550380 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.550356 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l7h7z"] Apr 16 17:45:37.553028 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:45:37.553005 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d21ef6_c2df_4bbd_8185_bf4fff5cb835.slice/crio-c756a7ed0c123e5564c7e3f0ac88a0406632298ce11ac4b4ce08f0e18c3c21e9 WatchSource:0}: Error finding container c756a7ed0c123e5564c7e3f0ac88a0406632298ce11ac4b4ce08f0e18c3c21e9: Status 404 returned error can't find the container with id c756a7ed0c123e5564c7e3f0ac88a0406632298ce11ac4b4ce08f0e18c3c21e9 Apr 16 17:45:37.571759 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.571739 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:45:37.575861 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.575841 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:45:37.602248 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602222 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:45:37.602499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602483 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-web" Apr 16 17:45:37.602499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602516 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-web" Apr 16 17:45:37.602499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602531 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="prometheus" Apr 16 17:45:37.602499 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602539 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="prometheus" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602552 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="thanos-sidecar" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602561 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="thanos-sidecar" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602573 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="config-reloader" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602582 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="config-reloader" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602595 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="init-config-reloader" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602604 2570 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="init-config-reloader" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602613 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-thanos" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602620 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-thanos" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602631 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602638 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602715 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="config-reloader" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602727 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-thanos" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602742 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="prometheus" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602751 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy-web" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602760 2570 
memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="kube-rbac-proxy" Apr 16 17:45:37.602768 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.602770 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" containerName="thanos-sidecar" Apr 16 17:45:37.607964 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.607947 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.610761 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.610740 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 17:45:37.610851 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.610785 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 17:45:37.610907 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.610878 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 17:45:37.610907 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.610882 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9lm9q\"" Apr 16 17:45:37.611010 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.610997 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 17:45:37.611059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611017 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 17:45:37.611059 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611041 2570 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 17:45:37.611163 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611110 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 17:45:37.611333 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611317 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 17:45:37.611395 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611344 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 17:45:37.611466 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611449 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 17:45:37.611961 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611947 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f1cc48p2uj6v5\"" Apr 16 17:45:37.612031 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.611992 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 17:45:37.615010 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.614994 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 17:45:37.616484 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.616465 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 17:45:37.625331 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.625314 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:45:37.720728 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720813 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720813 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720813 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720784 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.720922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720916 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720943 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 
ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.720990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-config\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-config-out\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721050 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721050 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721069 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721128 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721154 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.721235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.721195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qh6\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-kube-api-access-k9qh6\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822243 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822339 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qh6\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-kube-api-access-k9qh6\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822339 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822339 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822339 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822339 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822319 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822587 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822766 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822841 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.822895 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 17:45:37.822895 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-config\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823005 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.822985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-config-out\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823138 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 
kubenswrapper[2570]: I0416 17:45:37.823416 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-web-config\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.823933 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.823710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.825352 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.825322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.826285 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.826258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.826370 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.826344 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.828677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/bca16113-e298-4206-bb9e-2df6ce1ee175-config-out\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.828769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.828798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-web-config\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.829157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.829732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-config\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.830202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831385 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.830390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bca16113-e298-4206-bb9e-2df6ce1ee175-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831879 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.831846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831991 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.831858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qh6\" (UniqueName: \"kubernetes.io/projected/bca16113-e298-4206-bb9e-2df6ce1ee175-kube-api-access-k9qh6\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.831991 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.831908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:45:37.833002 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.832985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:37.833222 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.833202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bca16113-e298-4206-bb9e-2df6ce1ee175-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bca16113-e298-4206-bb9e-2df6ce1ee175\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:37.917562 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:37.917536 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:45:38.062579 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:38.062552 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:45:38.066459 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:45:38.066429 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbca16113_e298_4206_bb9e_2df6ce1ee175.slice/crio-0d5584b84493f93d36349c0fce8afb163757646a9da8aac1301208b67549a09f WatchSource:0}: Error finding container 0d5584b84493f93d36349c0fce8afb163757646a9da8aac1301208b67549a09f: Status 404 returned error can't find the container with id 0d5584b84493f93d36349c0fce8afb163757646a9da8aac1301208b67549a09f
Apr 16 17:45:38.258235 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:38.258192 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l7h7z" event={"ID":"38d21ef6-c2df-4bbd-8185-bf4fff5cb835","Type":"ContainerStarted","Data":"c756a7ed0c123e5564c7e3f0ac88a0406632298ce11ac4b4ce08f0e18c3c21e9"}
Apr 16 17:45:38.259687 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:38.259662 2570 generic.go:358] "Generic (PLEG): container finished" podID="bca16113-e298-4206-bb9e-2df6ce1ee175" containerID="9e45911e09f0af24c4afab3f74223b63d026ca2b4b6233f165efdb759725b23f" exitCode=0
Apr 16 17:45:38.259805 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:38.259719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerDied","Data":"9e45911e09f0af24c4afab3f74223b63d026ca2b4b6233f165efdb759725b23f"}
Apr 16 17:45:38.259805 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:38.259754 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"0d5584b84493f93d36349c0fce8afb163757646a9da8aac1301208b67549a09f"}
Apr 16 17:45:39.263922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.263891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l7h7z" event={"ID":"38d21ef6-c2df-4bbd-8185-bf4fff5cb835","Type":"ContainerStarted","Data":"75c89368d1d6395fd6b452a6bb098e5536d764abb9a6bf540bbd602c08a279fd"}
Apr 16 17:45:39.263922 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.263926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l7h7z" event={"ID":"38d21ef6-c2df-4bbd-8185-bf4fff5cb835","Type":"ContainerStarted","Data":"13d40177274fdc5364331cd007c30b2829170cc74903e5a59c3e19ac6880a1eb"}
Apr 16 17:45:39.266558 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"0fb6cfd09627322b5283f576f1ce8d1c92b9850f3915e4072d11323369c06eb7"}
Apr 16 17:45:39.266642 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"e24b6168b209596ea370db4879e599a8d2ef020029b51899ae05a8d062b66b28"}
Apr 16 17:45:39.266642 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266572 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"f7366892ec54d7e23b457a673141316f062c38d52e39a5317639176649699717"}
Apr 16 17:45:39.266642 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"fedabec81085975a46f3d2d39aa28167d1c285db06790006a46c10253a2d70ba"}
Apr 16 17:45:39.266642 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"92fd8d879c4c1234a1cdef38719c6bb7497005ea251f3ab5969f0700d81b91a9"}
Apr 16 17:45:39.266642 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.266595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bca16113-e298-4206-bb9e-2df6ce1ee175","Type":"ContainerStarted","Data":"f8a0a1fed77bf2d6685818cc8d0e01b4bd81a65196651caabdf272b6c71dc59f"}
Apr 16 17:45:39.281936 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.281903 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l7h7z" podStartSLOduration=253.32949473 podStartE2EDuration="4m14.281891648s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:45:37.554947939 +0000 UTC m=+252.635169058" lastFinishedPulling="2026-04-16 17:45:38.507344865 +0000 UTC m=+253.587565976" observedRunningTime="2026-04-16 17:45:39.28103253 +0000 UTC m=+254.361253659" watchObservedRunningTime="2026-04-16 17:45:39.281891648 +0000 UTC m=+254.362112776"
Apr 16 17:45:39.311146 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.311104 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.311088115 podStartE2EDuration="2.311088115s" podCreationTimestamp="2026-04-16 17:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:45:39.308809743 +0000 UTC m=+254.389030873" watchObservedRunningTime="2026-04-16 17:45:39.311088115 +0000 UTC m=+254.391309249"
Apr 16 17:45:39.528233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:39.528211 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146f79bd-0caa-448e-9c3a-ad348c633471" path="/var/lib/kubelet/pods/146f79bd-0caa-448e-9c3a-ad348c633471/volumes"
Apr 16 17:45:42.918583 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:45:42.918551 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:46:14.131739 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.131703 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5qxm9"]
Apr 16 17:46:14.138000 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.137968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.140895 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.140876 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 17:46:14.143274 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.143252 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5qxm9"]
Apr 16 17:46:14.164812 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.164793 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-original-pull-secret\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.164929 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.164834 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-dbus\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.164929 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.164900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-kubelet-config\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.266023 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.265997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-kubelet-config\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.266125 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.266061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-original-pull-secret\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.266125 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.266104 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-dbus\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.266232 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.266121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-kubelet-config\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.266286 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.266265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-dbus\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.268148 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.268129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4575a94e-42fb-4e34-8ae8-0eb4f54300c3-original-pull-secret\") pod \"global-pull-secret-syncer-5qxm9\" (UID: \"4575a94e-42fb-4e34-8ae8-0eb4f54300c3\") " pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.448270 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.448217 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5qxm9"
Apr 16 17:46:14.561715 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:14.561562 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5qxm9"]
Apr 16 17:46:14.564270 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:46:14.564245 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4575a94e_42fb_4e34_8ae8_0eb4f54300c3.slice/crio-8dfb96cd95c98072112f74f4f72ecbd36ce7d346f709a5f6b4751a28c765c47d WatchSource:0}: Error finding container 8dfb96cd95c98072112f74f4f72ecbd36ce7d346f709a5f6b4751a28c765c47d: Status 404 returned error can't find the container with id 8dfb96cd95c98072112f74f4f72ecbd36ce7d346f709a5f6b4751a28c765c47d
Apr 16 17:46:15.366467 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:15.366431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5qxm9" event={"ID":"4575a94e-42fb-4e34-8ae8-0eb4f54300c3","Type":"ContainerStarted","Data":"8dfb96cd95c98072112f74f4f72ecbd36ce7d346f709a5f6b4751a28c765c47d"}
Apr 16 17:46:19.379926 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:19.379892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5qxm9" event={"ID":"4575a94e-42fb-4e34-8ae8-0eb4f54300c3","Type":"ContainerStarted","Data":"73a0050b22feff1a34e5e2d4ac33a361e365b4e654e37c5f2537453bb2a185fd"}
Apr 16 17:46:19.395888 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:19.395851 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5qxm9" podStartSLOduration=1.547458872 podStartE2EDuration="5.395837782s" podCreationTimestamp="2026-04-16 17:46:14 +0000 UTC" firstStartedPulling="2026-04-16 17:46:14.565814549 +0000 UTC m=+289.646035656" lastFinishedPulling="2026-04-16 17:46:18.41419346 +0000 UTC m=+293.494414566" observedRunningTime="2026-04-16 17:46:19.395020435 +0000 UTC m=+294.475241578" watchObservedRunningTime="2026-04-16 17:46:19.395837782 +0000 UTC m=+294.476058911"
Apr 16 17:46:25.359915 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:25.359891 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:46:25.360364 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:25.359896 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:46:25.363851 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:25.363836 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:46:25.363960 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:25.363943 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:46:25.370844 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:25.370830 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 17:46:37.918571 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:37.918540 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:46:37.933102 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:37.933079 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:46:38.442237 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:46:38.442215 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:49:47.430189 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.430153 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jbqzb"]
Apr 16 17:49:47.433179 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.433164 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jbqzb"
Apr 16 17:49:47.435809 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.435786 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 17:49:47.437006 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.436989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j5pjh\""
Apr 16 17:49:47.437076 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.437031 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 17:49:47.437132 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.437120 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 17:49:47.444015 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.443995 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jbqzb"]
Apr 16 17:49:47.554437 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.554412 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mfs\" (UniqueName: \"kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs\") pod \"s3-init-jbqzb\" (UID: \"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed\") " pod="kserve/s3-init-jbqzb"
Apr 16 17:49:47.655738 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.655716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mfs\" (UniqueName: \"kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs\") pod \"s3-init-jbqzb\" (UID: \"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed\") " pod="kserve/s3-init-jbqzb"
Apr 16 17:49:47.663695 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.663677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mfs\" (UniqueName: \"kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs\") pod \"s3-init-jbqzb\" (UID: \"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed\") " pod="kserve/s3-init-jbqzb"
Apr 16 17:49:47.741851 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.741804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jbqzb"
Apr 16 17:49:47.857974 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.857945 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jbqzb"]
Apr 16 17:49:47.860196 ip-10-0-140-62 kubenswrapper[2570]: W0416 17:49:47.860152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5d7dd7_aa5c_4bbe_8a32_4df1d77a9aed.slice/crio-09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e WatchSource:0}: Error finding container 09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e: Status 404 returned error can't find the container with id 09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e
Apr 16 17:49:47.862054 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.862033 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:49:47.902887 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:47.902859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jbqzb" event={"ID":"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed","Type":"ContainerStarted","Data":"09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e"}
Apr 16 17:49:52.920864 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:52.920823 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jbqzb" event={"ID":"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed","Type":"ContainerStarted","Data":"28b8d862ed7fac03efa36b40f1874780cd74316dcdf9a6114f6514d57f63d6e5"}
Apr 16 17:49:52.938571 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:52.938529 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jbqzb" podStartSLOduration=1.444785621 podStartE2EDuration="5.938500025s" podCreationTimestamp="2026-04-16 17:49:47 +0000 UTC" firstStartedPulling="2026-04-16 17:49:47.862155717 +0000 UTC m=+502.942376828" lastFinishedPulling="2026-04-16 17:49:52.355870104 +0000 UTC m=+507.436091232" observedRunningTime="2026-04-16 17:49:52.93767681 +0000 UTC m=+508.017897938" watchObservedRunningTime="2026-04-16 17:49:52.938500025 +0000 UTC m=+508.018721150"
Apr 16 17:49:55.930395 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:55.930358 2570 generic.go:358] "Generic (PLEG): container finished" podID="ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed" containerID="28b8d862ed7fac03efa36b40f1874780cd74316dcdf9a6114f6514d57f63d6e5" exitCode=0
Apr 16 17:49:55.930773 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:55.930422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jbqzb" event={"ID":"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed","Type":"ContainerDied","Data":"28b8d862ed7fac03efa36b40f1874780cd74316dcdf9a6114f6514d57f63d6e5"}
Apr 16 17:49:57.051006 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.050985 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jbqzb"
Apr 16 17:49:57.132163 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.132137 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4mfs\" (UniqueName: \"kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs\") pod \"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed\" (UID: \"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed\") "
Apr 16 17:49:57.134327 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.134305 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs" (OuterVolumeSpecName: "kube-api-access-f4mfs") pod "ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed" (UID: "ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed"). InnerVolumeSpecName "kube-api-access-f4mfs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:49:57.233233 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.233184 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4mfs\" (UniqueName: \"kubernetes.io/projected/ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed-kube-api-access-f4mfs\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\""
Apr 16 17:49:57.938330 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.938274 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jbqzb"
Apr 16 17:49:57.938330 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.938296 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jbqzb" event={"ID":"ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed","Type":"ContainerDied","Data":"09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e"}
Apr 16 17:49:57.938330 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:49:57.938319 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c50634173765f6a118f19d1a21c06866e3dfaac212cb85160872974aef487e"
Apr 16 17:51:25.379965 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:51:25.379939 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:51:25.382487 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:51:25.382469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:51:25.383256 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:51:25.383239 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:51:25.385796 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:51:25.385771 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:56:25.399204 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:56:25.399178 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:56:25.402281 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:56:25.402256 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 17:56:25.402639 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:56:25.402621 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 17:56:25.405901 ip-10-0-140-62 kubenswrapper[2570]: I0416 17:56:25.405883 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 18:01:25.417749 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:01:25.417720 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 18:01:25.426145 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:01:25.426117 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 18:01:25.427571 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:01:25.427548 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log"
Apr 16 18:01:25.431293 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:01:25.431277 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log"
Apr 16 18:02:40.822840 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.822807 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-64jgr/must-gather-sqp95"]
Apr 16 18:02:40.827074 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.823068 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed" containerName="s3-init"
Apr 16 18:02:40.827074 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.823078 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed" containerName="s3-init"
Apr 16 18:02:40.827074 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.823133 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed" containerName="s3-init"
Apr 16 18:02:40.827815 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.827801 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:40.830918 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.830901 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-64jgr\"/\"kube-root-ca.crt\""
Apr 16 18:02:40.831001 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.830978 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-64jgr\"/\"openshift-service-ca.crt\""
Apr 16 18:02:40.832106 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.832089 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-64jgr\"/\"default-dockercfg-rm6sp\""
Apr 16 18:02:40.862502 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.862477 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-64jgr/must-gather-sqp95"]
Apr 16 18:02:40.923042 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.923018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:40.923130 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:40.923056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qdq\" (UniqueName: \"kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.023893 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.023869 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.023975 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.023906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qdq\" (UniqueName: \"kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.024154 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.024138 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.032236 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.032220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qdq\" (UniqueName: \"kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq\") pod \"must-gather-sqp95\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.141151 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.141098 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:02:41.259760 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.259739 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-64jgr/must-gather-sqp95"]
Apr 16 18:02:41.261763 ip-10-0-140-62 kubenswrapper[2570]: W0416 18:02:41.261736 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod851e87a9_928c_45f3_b64d_7c1613af030f.slice/crio-d67908eae866065fcc8f9d42e0aff538af4ea9d052c204c6239e231198f56bca WatchSource:0}: Error finding container d67908eae866065fcc8f9d42e0aff538af4ea9d052c204c6239e231198f56bca: Status 404 returned error can't find the container with id d67908eae866065fcc8f9d42e0aff538af4ea9d052c204c6239e231198f56bca
Apr 16 18:02:41.263417 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.263403 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:02:41.926454 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:41.926423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-64jgr/must-gather-sqp95" event={"ID":"851e87a9-928c-45f3-b64d-7c1613af030f","Type":"ContainerStarted","Data":"d67908eae866065fcc8f9d42e0aff538af4ea9d052c204c6239e231198f56bca"}
Apr 16 18:02:46.944501 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:46.944417 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-64jgr/must-gather-sqp95" event={"ID":"851e87a9-928c-45f3-b64d-7c1613af030f","Type":"ContainerStarted","Data":"c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c"}
Apr 16 18:02:46.944501 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:46.944467 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-64jgr/must-gather-sqp95" event={"ID":"851e87a9-928c-45f3-b64d-7c1613af030f","Type":"ContainerStarted","Data":"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f"}
Apr 16 18:02:46.966638 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:02:46.966592 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-64jgr/must-gather-sqp95" podStartSLOduration=1.906297568 podStartE2EDuration="6.966578093s" podCreationTimestamp="2026-04-16 18:02:40 +0000 UTC" firstStartedPulling="2026-04-16 18:02:41.263561538 +0000 UTC m=+1276.343782661" lastFinishedPulling="2026-04-16 18:02:46.323842075 +0000 UTC m=+1281.404063186" observedRunningTime="2026-04-16 18:02:46.966106463 +0000 UTC m=+1282.046327592" watchObservedRunningTime="2026-04-16 18:02:46.966578093 +0000 UTC m=+1282.046799222"
Apr 16 18:03:03.996146 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:03.996112 2570 generic.go:358] "Generic (PLEG): container finished" podID="851e87a9-928c-45f3-b64d-7c1613af030f" containerID="ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f" exitCode=0
Apr 16 18:03:03.996537 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:03.996181 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-64jgr/must-gather-sqp95" event={"ID":"851e87a9-928c-45f3-b64d-7c1613af030f","Type":"ContainerDied","Data":"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f"}
Apr 16 18:03:03.996537 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:03.996445 2570 scope.go:117] "RemoveContainer" containerID="ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f"
Apr 16 18:03:04.431893 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:04.431869 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-64jgr_must-gather-sqp95_851e87a9-928c-45f3-b64d-7c1613af030f/gather/0.log"
Apr 16 18:03:07.625802 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:07.621835 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5qxm9_4575a94e-42fb-4e34-8ae8-0eb4f54300c3/global-pull-secret-syncer/0.log"
Apr 16 18:03:07.837374 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:07.837346 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tl2gd_8e15b302-e2d8-4a43-85d6-2c1a3bb9b319/konnectivity-agent/0.log"
Apr 16 18:03:07.883892 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:07.883833 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-62.ec2.internal_045b24ad5fe56c3ffbf4f39e4a48e404/haproxy/0.log"
Apr 16 18:03:09.787862 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:09.787817 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-64jgr/must-gather-sqp95"]
Apr 16 18:03:09.788219 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:09.788029 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-64jgr/must-gather-sqp95" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="copy" containerID="cri-o://c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c" gracePeriod=2
Apr 16 18:03:09.794866 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:09.794841 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-64jgr/must-gather-sqp95"]
Apr 16 18:03:10.006892 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.006864 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-64jgr_must-gather-sqp95_851e87a9-928c-45f3-b64d-7c1613af030f/copy/0.log"
Apr 16 18:03:10.007202 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.007189 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:03:10.010246 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.010219 2570 status_manager.go:895] "Failed to get status for pod" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" pod="openshift-must-gather-64jgr/must-gather-sqp95" err="pods \"must-gather-sqp95\" is forbidden: User \"system:node:ip-10-0-140-62.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-64jgr\": no relationship found between node 'ip-10-0-140-62.ec2.internal' and this object"
Apr 16 18:03:10.015345 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.015326 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-64jgr_must-gather-sqp95_851e87a9-928c-45f3-b64d-7c1613af030f/copy/0.log"
Apr 16 18:03:10.015637 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.015616 2570 generic.go:358] "Generic (PLEG): container finished" podID="851e87a9-928c-45f3-b64d-7c1613af030f" containerID="c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c" exitCode=143
Apr 16 18:03:10.015701 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.015659 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-64jgr/must-gather-sqp95"
Apr 16 18:03:10.015701 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.015671 2570 scope.go:117] "RemoveContainer" containerID="c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c"
Apr 16 18:03:10.018529 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.018491 2570 status_manager.go:895] "Failed to get status for pod" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" pod="openshift-must-gather-64jgr/must-gather-sqp95" err="pods \"must-gather-sqp95\" is forbidden: User \"system:node:ip-10-0-140-62.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-64jgr\": no relationship found between node 'ip-10-0-140-62.ec2.internal' and this object"
Apr 16 18:03:10.022135 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.022120 2570 scope.go:117] "RemoveContainer" containerID="ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f"
Apr 16 18:03:10.033403 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.033387 2570 scope.go:117] "RemoveContainer" containerID="c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c"
Apr 16 18:03:10.033667 ip-10-0-140-62 kubenswrapper[2570]: E0416 18:03:10.033648 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c\": container with ID starting with c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c not found: ID does not exist" containerID="c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c"
Apr 16 18:03:10.033735 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.033675 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c"} err="failed to get container status 
\"c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c\": rpc error: code = NotFound desc = could not find container \"c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c\": container with ID starting with c61d3d8abfe9b73347064a88199e42362d88861fb99ad7d27d730242d841ad4c not found: ID does not exist" Apr 16 18:03:10.033735 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.033693 2570 scope.go:117] "RemoveContainer" containerID="ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f" Apr 16 18:03:10.033933 ip-10-0-140-62 kubenswrapper[2570]: E0416 18:03:10.033915 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f\": container with ID starting with ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f not found: ID does not exist" containerID="ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f" Apr 16 18:03:10.033978 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.033940 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f"} err="failed to get container status \"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f\": rpc error: code = NotFound desc = could not find container \"ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f\": container with ID starting with ac44f9b3d03d459f461107755b22529ee05d3afec7616c2abc3c67322a5e455f not found: ID does not exist" Apr 16 18:03:10.151730 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.151675 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5qdq\" (UniqueName: \"kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq\") pod \"851e87a9-928c-45f3-b64d-7c1613af030f\" (UID: 
\"851e87a9-928c-45f3-b64d-7c1613af030f\") " Apr 16 18:03:10.151730 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.151721 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output\") pod \"851e87a9-928c-45f3-b64d-7c1613af030f\" (UID: \"851e87a9-928c-45f3-b64d-7c1613af030f\") " Apr 16 18:03:10.152976 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.152950 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "851e87a9-928c-45f3-b64d-7c1613af030f" (UID: "851e87a9-928c-45f3-b64d-7c1613af030f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:10.153601 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.153567 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq" (OuterVolumeSpecName: "kube-api-access-v5qdq") pod "851e87a9-928c-45f3-b64d-7c1613af030f" (UID: "851e87a9-928c-45f3-b64d-7c1613af030f"). InnerVolumeSpecName "kube-api-access-v5qdq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:03:10.252850 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.252819 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/851e87a9-928c-45f3-b64d-7c1613af030f-must-gather-output\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 18:03:10.252850 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.252850 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5qdq\" (UniqueName: \"kubernetes.io/projected/851e87a9-928c-45f3-b64d-7c1613af030f-kube-api-access-v5qdq\") on node \"ip-10-0-140-62.ec2.internal\" DevicePath \"\"" Apr 16 18:03:10.325253 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:10.325231 2570 status_manager.go:895] "Failed to get status for pod" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" pod="openshift-must-gather-64jgr/must-gather-sqp95" err="pods \"must-gather-sqp95\" is forbidden: User \"system:node:ip-10-0-140-62.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-64jgr\": no relationship found between node 'ip-10-0-140-62.ec2.internal' and this object" Apr 16 18:03:11.527618 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.527558 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" path="/var/lib/kubelet/pods/851e87a9-928c-45f3-b64d-7c1613af030f/volumes" Apr 16 18:03:11.566745 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.566726 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jlmqv_00e7d4de-424d-4342-802b-f7c98a15bf8b/node-exporter/0.log" Apr 16 18:03:11.587769 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.587750 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jlmqv_00e7d4de-424d-4342-802b-f7c98a15bf8b/kube-rbac-proxy/0.log" Apr 16 18:03:11.608046 
ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.608029 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jlmqv_00e7d4de-424d-4342-802b-f7c98a15bf8b/init-textfile/0.log" Apr 16 18:03:11.792027 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.791999 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/prometheus/0.log" Apr 16 18:03:11.807614 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.807597 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/config-reloader/0.log" Apr 16 18:03:11.826125 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.826109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/thanos-sidecar/0.log" Apr 16 18:03:11.844994 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.844976 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/kube-rbac-proxy-web/0.log" Apr 16 18:03:11.863526 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.863490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/kube-rbac-proxy/0.log" Apr 16 18:03:11.884269 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.884252 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/kube-rbac-proxy-thanos/0.log" Apr 16 18:03:11.903537 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:11.903501 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_bca16113-e298-4206-bb9e-2df6ce1ee175/init-config-reloader/0.log" Apr 16 18:03:12.000901 ip-10-0-140-62 kubenswrapper[2570]: I0416 
18:03:12.000881 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f57875499-n9xmm_8cc2e311-0c6d-40bc-8b60-2efa8925192f/telemeter-client/0.log" Apr 16 18:03:12.020778 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:12.020758 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f57875499-n9xmm_8cc2e311-0c6d-40bc-8b60-2efa8925192f/reload/0.log" Apr 16 18:03:12.048134 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:12.048090 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f57875499-n9xmm_8cc2e311-0c6d-40bc-8b60-2efa8925192f/kube-rbac-proxy/0.log" Apr 16 18:03:13.767284 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:13.767253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/2.log" Apr 16 18:03:13.771657 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:13.771636 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4977r_7c31261b-9dd1-44b8-b6ad-4092f61c1883/console-operator/3.log" Apr 16 18:03:14.530200 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.530168 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-zqvq9_bae703ff-9612-493f-8d86-c275eac39802/volume-data-source-validator/0.log" Apr 16 18:03:14.861034 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861003 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz"] Apr 16 18:03:14.861436 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861424 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="gather" Apr 16 18:03:14.861501 
ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861441 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="gather" Apr 16 18:03:14.861501 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861455 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="copy" Apr 16 18:03:14.861501 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861464 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="copy" Apr 16 18:03:14.861675 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861549 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="copy" Apr 16 18:03:14.861675 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.861565 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="851e87a9-928c-45f3-b64d-7c1613af030f" containerName="gather" Apr 16 18:03:14.863306 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.863289 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:14.865832 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.865812 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xznzl\"/\"openshift-service-ca.crt\"" Apr 16 18:03:14.865915 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.865812 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xznzl\"/\"default-dockercfg-b8sdr\"" Apr 16 18:03:14.867092 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.867073 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xznzl\"/\"kube-root-ca.crt\"" Apr 16 18:03:14.873777 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.873759 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz"] Apr 16 18:03:14.983019 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.982998 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-podres\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:14.983112 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.983029 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jld9r\" (UniqueName: \"kubernetes.io/projected/6e840063-2867-41ae-b677-d2433240f6f8-kube-api-access-jld9r\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:14.983112 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.983059 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-lib-modules\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:14.983196 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.983131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-sys\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:14.983196 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:14.983155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-proc\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084467 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-lib-modules\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084592 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-sys\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " 
pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084592 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-proc\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084592 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-podres\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084727 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-sys\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084727 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jld9r\" (UniqueName: \"kubernetes.io/projected/6e840063-2867-41ae-b677-d2433240f6f8-kube-api-access-jld9r\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084727 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-lib-modules\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084727 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-proc\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.084884 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.084757 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e840063-2867-41ae-b677-d2433240f6f8-podres\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.092546 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.092523 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jld9r\" (UniqueName: \"kubernetes.io/projected/6e840063-2867-41ae-b677-d2433240f6f8-kube-api-access-jld9r\") pod \"perf-node-gather-daemonset-h57vz\" (UID: \"6e840063-2867-41ae-b677-d2433240f6f8\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.172623 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.172576 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:15.242614 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.242592 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-95rfp_cca91f7c-ca0a-4c7a-97bc-93c7e35d6271/dns/0.log" Apr 16 18:03:15.261845 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.261815 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-95rfp_cca91f7c-ca0a-4c7a-97bc-93c7e35d6271/kube-rbac-proxy/0.log" Apr 16 18:03:15.289344 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.289314 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz"] Apr 16 18:03:15.291975 ip-10-0-140-62 kubenswrapper[2570]: W0416 18:03:15.291945 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6e840063_2867_41ae_b677_d2433240f6f8.slice/crio-ca4e22bdb8b7aac9e852f4e86e7d028dc277aac11be357501bc25bb6e0ef4f55 WatchSource:0}: Error finding container ca4e22bdb8b7aac9e852f4e86e7d028dc277aac11be357501bc25bb6e0ef4f55: Status 404 returned error can't find the container with id ca4e22bdb8b7aac9e852f4e86e7d028dc277aac11be357501bc25bb6e0ef4f55 Apr 16 18:03:15.313577 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.313560 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j272p_25d3965d-906a-4df5-bec1-9edc3c2a8a64/dns-node-resolver/0.log" Apr 16 18:03:15.728524 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.728475 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6cb5b774b5-jf8zl_640fee70-6120-4a2c-9c86-bec1973f853e/registry/0.log" Apr 16 18:03:15.794779 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:15.794755 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-x6c67_aae5a78d-4165-4422-a612-17627616235f/node-ca/0.log" Apr 16 18:03:16.034107 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:16.034084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" event={"ID":"6e840063-2867-41ae-b677-d2433240f6f8","Type":"ContainerStarted","Data":"06fe5c0d35f3dd3004b4033ca3b4f40e0810bf75a9393cfa0c43e037e4f1add0"} Apr 16 18:03:16.034414 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:16.034112 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" event={"ID":"6e840063-2867-41ae-b677-d2433240f6f8","Type":"ContainerStarted","Data":"ca4e22bdb8b7aac9e852f4e86e7d028dc277aac11be357501bc25bb6e0ef4f55"} Apr 16 18:03:16.034414 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:16.034218 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:16.050236 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:16.050198 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" podStartSLOduration=2.050185993 podStartE2EDuration="2.050185993s" podCreationTimestamp="2026-04-16 18:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:16.049719946 +0000 UTC m=+1311.129941074" watchObservedRunningTime="2026-04-16 18:03:16.050185993 +0000 UTC m=+1311.130407121" Apr 16 18:03:16.812368 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:16.812347 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4ctsq_64a21724-619d-4a2f-b61f-bf293308211b/serve-healthcheck-canary/0.log" Apr 16 18:03:17.222883 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:17.222824 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bgq6_d9f7e926-aa6c-4880-8031-bdee5bf28608/kube-rbac-proxy/0.log" Apr 16 18:03:17.246458 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:17.246440 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bgq6_d9f7e926-aa6c-4880-8031-bdee5bf28608/exporter/0.log" Apr 16 18:03:17.266157 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:17.266134 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bgq6_d9f7e926-aa6c-4880-8031-bdee5bf28608/extractor/0.log" Apr 16 18:03:19.546880 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:19.546855 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jbqzb_ce5d7dd7-aa5c-4bbe-8a32-4df1d77a9aed/s3-init/0.log" Apr 16 18:03:22.045747 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:22.045719 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-h57vz" Apr 16 18:03:23.078604 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:23.078572 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-9b68m_203a89e4-4618-4411-aa1c-395cb2b30306/migrator/0.log" Apr 16 18:03:23.097550 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:23.097531 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-9b68m_203a89e4-4618-4411-aa1c-395cb2b30306/graceful-termination/0.log" Apr 16 18:03:24.480847 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.480787 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/kube-multus-additional-cni-plugins/0.log" Apr 16 18:03:24.503459 ip-10-0-140-62 kubenswrapper[2570]: I0416 
18:03:24.503442 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/egress-router-binary-copy/0.log" Apr 16 18:03:24.527157 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.527135 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/cni-plugins/0.log" Apr 16 18:03:24.550878 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.547113 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/bond-cni-plugin/0.log" Apr 16 18:03:24.566396 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.566372 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/routeoverride-cni/0.log" Apr 16 18:03:24.584695 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.584678 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/whereabouts-cni-bincopy/0.log" Apr 16 18:03:24.603414 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.603397 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdzwz_29a15c3f-ab4c-4d8b-9a95-9ca90ba23bd8/whereabouts-cni/0.log" Apr 16 18:03:24.950682 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:24.950662 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7mp6_fb67ec76-fe28-444f-b4f0-51430f30c713/kube-multus/0.log" Apr 16 18:03:25.038279 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:25.038261 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l7h7z_38d21ef6-c2df-4bbd-8185-bf4fff5cb835/network-metrics-daemon/0.log" Apr 16 
18:03:25.055718 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:25.055685 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l7h7z_38d21ef6-c2df-4bbd-8185-bf4fff5cb835/kube-rbac-proxy/0.log" Apr 16 18:03:26.179909 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.179885 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-controller/0.log" Apr 16 18:03:26.198311 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.198275 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/0.log" Apr 16 18:03:26.204709 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.204691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovn-acl-logging/1.log" Apr 16 18:03:26.223054 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.223039 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/kube-rbac-proxy-node/0.log" Apr 16 18:03:26.242653 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.242634 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:03:26.259484 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.259468 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/northd/0.log" Apr 16 18:03:26.278434 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.278412 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/nbdb/0.log" Apr 16 
18:03:26.297271 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.297255 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/sbdb/0.log" Apr 16 18:03:26.383881 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:26.383837 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9mxm_6f384fae-bee8-46b2-8fd3-71f7ece4b87e/ovnkube-controller/0.log" Apr 16 18:03:27.557266 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:27.557233 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5rkjf_26163ff9-2d96-4401-962b-735123e76554/network-check-target-container/0.log" Apr 16 18:03:28.467196 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:28.467172 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7lffp_0321b210-b7f5-4cfe-9d6d-318f4ba3d299/iptables-alerter/0.log" Apr 16 18:03:29.145926 ip-10-0-140-62 kubenswrapper[2570]: I0416 18:03:29.145902 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gjbj8_9a25f350-b652-47af-8404-87e373883218/tuned/0.log"