Apr 23 09:28:20.650131 ip-10-0-140-162 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 09:28:20.650142 ip-10-0-140-162 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 09:28:20.650149 ip-10-0-140-162 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 09:28:20.650366 ip-10-0-140-162 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 09:28:30.684228 ip-10-0-140-162 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 09:28:30.684244 ip-10-0-140-162 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 248345b9a391445cba35c6b78188fb97 --
Apr 23 09:30:41.717200 ip-10-0-140-162 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 09:30:42.179859 ip-10-0-140-162 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:42.179859 ip-10-0-140-162 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 09:30:42.179859 ip-10-0-140-162 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:42.179859 ip-10-0-140-162 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 09:30:42.179859 ip-10-0-140-162 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:42.180933 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.180835 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 09:30:42.184225 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184206 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:42.184225 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184224 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:42.184225 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184230 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184234 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184239 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184243 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184247 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184251 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184255 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184259 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184262 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184266 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184270 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184274 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184279 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184283 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184286 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184290 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184302 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184306 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184310 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184314 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:42.184436 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184319 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184324 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184328 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184331 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184335 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184339 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184343 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184347 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184352 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184356 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184360 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184363 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184368 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184372 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184377 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184382 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184387 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184392 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184395 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184399 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:42.185290 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184406 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184410 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184414 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184418 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184423 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184427 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184431 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184435 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184438 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184443 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184448 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184452 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184456 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184460 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184465 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184469 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184473 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184478 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184482 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184486 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:42.185995 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184490 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184494 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184498 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184502 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184507 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184514 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184522 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184527 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184532 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184536 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184541 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184545 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184550 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184554 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184558 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184562 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184570 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184575 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:42.186600 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184580 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184585 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184589 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184594 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184599 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.184603 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185241 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185250 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185255 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185260 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185264 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185268 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185273 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185277 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185281 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185286 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185289 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185294 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185297 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:42.187382 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185301 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185308 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185315 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185323 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185328 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185333 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185338 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185342 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185346 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185350 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185354 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185358 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185362 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185366 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185370 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185375 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185380 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185386 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185390 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:42.188088 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185394 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185398 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185402 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185406 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185410 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185415 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185419 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185423 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185427 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185432 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185436 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185440 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185444 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185449 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185455 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185460 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185464 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185468 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185473 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185477 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:42.188559 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185482 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185486 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185491 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185495 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185499 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185503 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185507 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185511 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185515 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185519 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185522 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185527 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185532 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185536 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185540 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185544 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185548 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185552 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185557 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185561 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:42.189185 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185565 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185569 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185574 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185578 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185582 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185587 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185591 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185595 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185598 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185602 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185606 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185610 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185614 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.185618 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186376 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186393 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186404 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186412 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186418 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186423 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186431 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 09:30:42.189890 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186437 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186447 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186453 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186459 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186464 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186469 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186474 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186478 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186483 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186488 2577 flags.go:64] FLAG: --cloud-config=""
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186493 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186498 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186504 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186509 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186514 2577 flags.go:64] FLAG: --config-dir=""
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186518 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186523 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186530 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186535 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186540 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186546 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186551 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186555 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186561 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186565 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 09:30:42.190442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186570 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186578 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186584 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186588 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186593 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186598 2577 flags.go:64] FLAG: --enable-server="true"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186602 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186610 2577 flags.go:64] FLAG: --event-burst="100"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186616 2577 flags.go:64] FLAG: --event-qps="50"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186621 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186626 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186631 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186637 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186642 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186647 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186652 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186657 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 09:30:42.191154 ip-10-0-140-162
kubenswrapper[2577]: I0423 09:30:42.186662 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186666 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186671 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186676 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186680 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186685 2577 flags.go:64] FLAG: --feature-gates="" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186691 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186696 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 09:30:42.191154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186701 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186706 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186711 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186716 2577 flags.go:64] FLAG: --help="false" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186721 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186726 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186731 2577 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186736 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186741 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186747 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186751 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186756 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186761 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186765 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186788 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186795 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186800 2577 flags.go:64] FLAG: --kube-reserved="" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186805 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186809 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186815 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 09:30:42.191790 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:30:42.186820 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186825 2577 flags.go:64] FLAG: --lock-file="" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186829 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186834 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 09:30:42.191790 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186839 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186848 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186853 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186857 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186862 2577 flags.go:64] FLAG: --logging-format="text" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186866 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186872 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186877 2577 flags.go:64] FLAG: --manifest-url="" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186881 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186888 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186893 2577 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186899 2577 flags.go:64] FLAG: --max-pods="110" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186904 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186909 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186914 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186919 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186924 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186929 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186933 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186945 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186950 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186955 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186960 2577 flags.go:64] FLAG: --pod-cidr="" Apr 23 09:30:42.192382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186966 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 09:30:42.192950 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:30:42.186975 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186980 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186985 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186989 2577 flags.go:64] FLAG: --port="10250" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.186995 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187000 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06e020e6b736677c0" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187006 2577 flags.go:64] FLAG: --qos-reserved="" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187010 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187015 2577 flags.go:64] FLAG: --register-node="true" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187020 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187024 2577 flags.go:64] FLAG: --register-with-taints="" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187031 2577 flags.go:64] FLAG: --registry-burst="10" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187036 2577 flags.go:64] FLAG: --registry-qps="5" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187040 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187045 2577 flags.go:64] FLAG: --reserved-memory="" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:30:42.187051 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187056 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187061 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187066 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187070 2577 flags.go:64] FLAG: --runonce="false" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187075 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187080 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187085 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187090 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187095 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 09:30:42.192950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187100 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187104 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187110 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187114 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187119 2577 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187124 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187129 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187135 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187139 2577 flags.go:64] FLAG: --system-cgroups="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187144 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187152 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187157 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187163 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187177 2577 flags.go:64] FLAG: --tls-min-version="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187182 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187187 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187192 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187197 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187202 2577 flags.go:64] FLAG: --v="2" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:30:42.187209 2577 flags.go:64] FLAG: --version="false" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187215 2577 flags.go:64] FLAG: --vmodule="" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187221 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.187227 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187388 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 09:30:42.193566 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187394 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187399 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187403 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187408 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187413 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187418 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187422 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187427 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187432 2577 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187436 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187440 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187445 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187449 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187453 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187457 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187463 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187467 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187472 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187476 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187480 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 09:30:42.194207 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187484 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 09:30:42.195023 ip-10-0-140-162 
kubenswrapper[2577]: W0423 09:30:42.187489 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187493 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187497 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187502 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187506 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187510 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187514 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187518 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187522 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187529 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187534 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187538 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187542 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 09:30:42.195023 
ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187546 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187551 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187554 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187558 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187562 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187567 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 09:30:42.195023 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187571 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187574 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187578 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187582 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187586 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187590 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187596 2577 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187601 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187605 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187609 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187613 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187618 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187621 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187625 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187630 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187634 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187639 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187643 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187647 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 
09:30:42.195700 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187652 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187656 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187660 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187665 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187669 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187674 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187678 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187682 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187686 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187690 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187694 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187699 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187702 2577 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187706 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187710 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187714 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187718 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187722 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187729 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:42.196196 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187734 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187740 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187745 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187751 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187757 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187761 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.187766 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:42.196659 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.188805 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:42.196920 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.196802 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 09:30:42.196952 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.196923 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 09:30:42.196981 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196973 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:42.196981 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196979 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196983 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196986 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196989 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196992 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.196996 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197000 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197003 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197006 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197009 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197012 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197014 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197018 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197020 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197023 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197026 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197029 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197031 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197034 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197037 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:42.197036 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197040 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197043 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197045 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197049 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197052 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197055 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197057 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197060 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197062 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197065 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197068 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197070 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197072 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197075 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197077 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197080 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197082 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197085 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197087 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197090 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:42.197520 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197092 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197095 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197097 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197099 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197102 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197105 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197107 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197109 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197112 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197114 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197117 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197119 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197121 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197124 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197127 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197130 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197132 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197135 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197137 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:42.198025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197140 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197142 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197145 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197147 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197150 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197152 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197155 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197157 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197160 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197163 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197165 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197167 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197172 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197176 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197179 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197182 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197185 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197188 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197191 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:42.198522 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197193 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197195 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197198 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197200 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197203 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197205 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197208 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.197213 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197315 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197321 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197325 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197328 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197331 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197334 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197337 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:42.199025 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197339 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197342 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197345 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197348 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197351 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197353 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197356 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197359 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197361 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197364 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197366 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197369 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197371 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197374 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197376 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197379 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197383 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197386 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197388 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:42.199398 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197391 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197394 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197396 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197399 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197401 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197404 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197407 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197410 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197412 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197415 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197417 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197420 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197422 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197424 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197427 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197430 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197432 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197435 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197437 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197439 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:42.199892 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197441 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197444 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197446 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197449 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197451 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197454 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197456 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197458 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197461 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197464 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197466 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197469 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197471 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197473 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197476 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197479 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197481 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197484 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197487 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197489 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:42.200411 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197492 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197494 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197497 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197499 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197501 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197504 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197506 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197509 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197511 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197514 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197517 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197520 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197523 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197526 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197529 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197532 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197535 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197538 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197541 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:42.200953 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:42.197543 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:42.201410 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.197548 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:42.201410 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.198277 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 09:30:42.201410 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.200295 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 09:30:42.201410 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.201338 2577 server.go:1019] "Starting client certificate rotation"
Apr 23 09:30:42.201534 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.201428 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:42.201534 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.201472 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:42.225641 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.225619 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:42.230294 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.230275 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:42.247445 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.244980 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 23 09:30:42.250680 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.250666 2577 log.go:25] "Validated CRI v1 image API"
Apr 23 09:30:42.251988 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.251973 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 09:30:42.256658 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.256626 2577 fs.go:135] Filesystem UUIDs: map[0e87485d-106a-4049-a927-6bfe6a8a3eb3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c86de7eb-2a2d-4f89-88ff-0aeea253b9cf:/dev/nvme0n1p4]
Apr 23 09:30:42.256658 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.256650 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 09:30:42.262372 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.262256 2577 manager.go:217] Machine: {Timestamp:2026-04-23 09:30:42.260246094 +0000 UTC m=+0.415341054 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092665 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cebbdf2bdd33a450e4d784e340100 SystemUUID:ec2cebbd-f2bd-d33a-450e-4d784e340100 BootID:248345b9-a391-445c-ba35-c6b78188fb97 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b7:2d:ae:9e:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b7:2d:ae:9e:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:46:14:f9:5a:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 09:30:42.262372 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.262361 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 09:30:42.262509 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.262480 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 09:30:42.262896 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.262880 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 09:30:42.265022 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.264995 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 09:30:42.265165 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.265025 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-162.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 09:30:42.265218 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.265174 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 09:30:42.265218 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.265183 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 09:30:42.265218 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.265195
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 09:30:42.265931 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.265917 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 09:30:42.266741 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.266732 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 23 09:30:42.266873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.266864 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 09:30:42.269421 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.269411 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 23 09:30:42.269507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.269426 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 09:30:42.269507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.269438 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 09:30:42.269507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.269448 2577 kubelet.go:397] "Adding apiserver pod source" Apr 23 09:30:42.269507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.269456 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 09:30:42.270649 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.270636 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 09:30:42.270724 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.270654 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 09:30:42.274030 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.274013 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 09:30:42.275446 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:30:42.275404 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 09:30:42.277128 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277113 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 09:30:42.277128 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277130 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277136 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277142 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277148 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277153 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277159 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277164 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277171 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277177 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277186 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 
09:30:42.277226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.277195 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 09:30:42.278203 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.278190 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 09:30:42.278249 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.278206 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 09:30:42.280469 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.280443 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 09:30:42.280514 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.280466 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 09:30:42.282365 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.282352 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 09:30:42.282446 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.282394 2577 server.go:1295] "Started kubelet" Apr 23 09:30:42.282547 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.282520 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 09:30:42.282580 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.282514 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 09:30:42.282608 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.282580 2577 server_v1.go:47] "podresources" method="list" 
useActivePods=true Apr 23 09:30:42.283314 ip-10-0-140-162 systemd[1]: Started Kubernetes Kubelet. Apr 23 09:30:42.283859 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.283622 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 09:30:42.285110 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.285093 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7lfzp" Apr 23 09:30:42.285181 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.285162 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 23 09:30:42.289825 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.289805 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 09:30:42.290516 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.290497 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 09:30:42.291176 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.291163 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 09:30:42.291442 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.291417 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.292339 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.292313 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 09:30:42.292339 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.292342 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 09:30:42.292485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.292471 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 23 09:30:42.292485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.292486 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 23 09:30:42.292631 
ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.292616 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7lfzp" Apr 23 09:30:42.293667 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.293636 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 09:30:42.293766 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.293747 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 09:30:42.293838 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.293809 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-162.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 09:30:42.294330 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294314 2577 factory.go:55] Registering systemd factory Apr 23 09:30:42.294407 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294341 2577 factory.go:223] Registration of the systemd container factory successfully Apr 23 09:30:42.294551 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294539 2577 factory.go:153] Registering CRI-O factory Apr 23 09:30:42.294592 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294556 2577 factory.go:223] Registration of the crio container factory successfully Apr 23 09:30:42.294625 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294608 2577 
factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 09:30:42.294665 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294637 2577 factory.go:103] Registering Raw factory Apr 23 09:30:42.294665 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.294652 2577 manager.go:1196] Started watching for new ooms in manager Apr 23 09:30:42.294889 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.293896 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-162.ec2.internal.18a8f276279d9c26 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-162.ec2.internal,UID:ip-10-0-140-162.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-162.ec2.internal,},FirstTimestamp:2026-04-23 09:30:42.282363942 +0000 UTC m=+0.437458902,LastTimestamp:2026-04-23 09:30:42.282363942 +0000 UTC m=+0.437458902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-162.ec2.internal,}" Apr 23 09:30:42.294983 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.294944 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 09:30:42.295154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.295132 2577 manager.go:319] Starting recovery of all containers Apr 23 09:30:42.303979 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.303959 2577 manager.go:324] Recovery completed Apr 23 09:30:42.308037 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.308023 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.310468 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.310453 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.310514 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.310484 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:42.310514 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.310499 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.311032 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.311019 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 09:30:42.311094 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.311033 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 09:30:42.311094 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.311065 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 23 09:30:42.313422 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.313411 2577 policy_none.go:49] "None policy: Start" Apr 23 09:30:42.313461 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.313427 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 09:30:42.313461 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:30:42.313436 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 23 09:30:42.364360 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364343 2577 manager.go:341] "Starting Device Plugin manager" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.364403 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364417 2577 server.go:85] "Starting device plugin registration server" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364671 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364682 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364788 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364865 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.364873 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.368049 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 09:30:42.376098 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.368082 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.402161 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.402138 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 09:30:42.403514 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.403500 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 09:30:42.403584 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.403525 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 09:30:42.403584 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.403543 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 09:30:42.403584 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.403556 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 09:30:42.403707 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.403640 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 09:30:42.405972 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.405953 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:42.465039 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.464944 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.466045 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.466029 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.466127 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.466057 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:42.466127 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.466067 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.466127 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.466090 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.475169 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.475151 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.475217 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.475173 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-162.ec2.internal\": node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.485616 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.485595 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.504269 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.504245 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal"] Apr 23 09:30:42.504366 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.504303 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.505156 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.505142 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.505218 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.505169 2577 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:42.505218 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.505179 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.506240 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506229 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.506379 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506364 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.506414 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506399 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.506973 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.507055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506989 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:42.507055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.506957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.507055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.507005 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.507055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.507013 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 
09:30:42.507055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.507026 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.508247 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.508234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.508302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.508259 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:42.509509 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.509493 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:42.509574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.509526 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:42.509574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.509540 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:42.529444 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.529424 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-162.ec2.internal\" not found" node="ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.533885 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.533870 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-162.ec2.internal\" not found" node="ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.586341 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.586318 2577 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.594025 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.594006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d85ba369ef6b926856de0defa6be0a9e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-162.ec2.internal\" (UID: \"d85ba369ef6b926856de0defa6be0a9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.594084 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.594031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.594084 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.594051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.687097 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.687058 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.694401 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d85ba369ef6b926856de0defa6be0a9e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-162.ec2.internal\" (UID: 
\"d85ba369ef6b926856de0defa6be0a9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.694450 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.694450 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.694508 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.694508 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d85ba369ef6b926856de0defa6be0a9e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-162.ec2.internal\" (UID: \"d85ba369ef6b926856de0defa6be0a9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.694508 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.694488 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8dc30e02f5cc8b2c2c7c8ea212f0c18d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal\" (UID: \"8dc30e02f5cc8b2c2c7c8ea212f0c18d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.787884 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.787814 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.831387 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.831361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.836906 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:42.836886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:42.888336 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.888304 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:42.988827 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:42.988799 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.089463 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.089374 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.190062 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.190030 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.191096 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.191081 2577 
reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:43.201031 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.201011 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 09:30:43.201137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.201122 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:43.201178 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.201160 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:43.290696 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.290664 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 09:30:43.290696 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.290665 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.294724 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.294693 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 09:25:42 +0000 UTC" deadline="2027-11-25 12:21:50.584909233 +0000 UTC" Apr 23 09:30:43.294808 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.294726 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13946h51m7.290187358s" Apr 23 09:30:43.311386 
ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.311354 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 09:30:43.336764 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.336743 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r2266" Apr 23 09:30:43.340131 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:43.339940 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc30e02f5cc8b2c2c7c8ea212f0c18d.slice/crio-6ca289a4fef26c1d4eadbaa1e6d8a8a3b6554d8b3da5ee29f1df67d81f596a6a WatchSource:0}: Error finding container 6ca289a4fef26c1d4eadbaa1e6d8a8a3b6554d8b3da5ee29f1df67d81f596a6a: Status 404 returned error can't find the container with id 6ca289a4fef26c1d4eadbaa1e6d8a8a3b6554d8b3da5ee29f1df67d81f596a6a Apr 23 09:30:43.340657 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:43.340630 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85ba369ef6b926856de0defa6be0a9e.slice/crio-e7354d0118b641c074e159bbd8bc37d7d95fd547b7936486ce87261ca7a635f9 WatchSource:0}: Error finding container e7354d0118b641c074e159bbd8bc37d7d95fd547b7936486ce87261ca7a635f9: Status 404 returned error can't find the container with id e7354d0118b641c074e159bbd8bc37d7d95fd547b7936486ce87261ca7a635f9 Apr 23 09:30:43.344946 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.344929 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r2266" Apr 23 09:30:43.345170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.345156 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:30:43.370456 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:30:43.370429 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:43.391427 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.391403 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.406501 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.406449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" event={"ID":"d85ba369ef6b926856de0defa6be0a9e","Type":"ContainerStarted","Data":"e7354d0118b641c074e159bbd8bc37d7d95fd547b7936486ce87261ca7a635f9"} Apr 23 09:30:43.407336 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.407316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" event={"ID":"8dc30e02f5cc8b2c2c7c8ea212f0c18d","Type":"ContainerStarted","Data":"6ca289a4fef26c1d4eadbaa1e6d8a8a3b6554d8b3da5ee29f1df67d81f596a6a"} Apr 23 09:30:43.491596 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.491569 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.592088 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.592009 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.692468 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:43.692435 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-162.ec2.internal\" not found" Apr 23 09:30:43.741335 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.741307 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:43.791350 ip-10-0-140-162 kubenswrapper[2577]: 
I0423 09:30:43.791325 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" Apr 23 09:30:43.808308 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.808282 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:43.809357 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.809337 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" Apr 23 09:30:43.821083 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:43.821051 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:44.056243 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.056156 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:44.270836 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.270800 2577 apiserver.go:52] "Watching apiserver" Apr 23 09:30:44.280380 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.280192 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 09:30:44.280566 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.280545 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal","openshift-multus/multus-additional-cni-plugins-hktck","openshift-multus/multus-stttf","openshift-multus/network-metrics-daemon-zh888","kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal","openshift-cluster-node-tuning-operator/tuned-v6fd9","openshift-image-registry/node-ca-bcm2p","openshift-network-diagnostics/network-check-target-jrn69","openshift-network-operator/iptables-alerter-lhfct","openshift-ovn-kubernetes/ovnkube-node-54r9n","kube-system/konnectivity-agent-gkkhl"] Apr 23 09:30:44.282799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.282760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.283820 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.283799 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.284714 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.284696 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-stttf" Apr 23 09:30:44.285519 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285327 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.285519 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kwc2k\"" Apr 23 09:30:44.285519 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285471 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.285685 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285611 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 09:30:44.285899 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:30:44.285997 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.285972 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.285997 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.285968 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea" Apr 23 09:30:44.286300 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286272 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 09:30:44.286390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286313 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fcwn7\"" Apr 23 09:30:44.286390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286367 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 09:30:44.286498 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286403 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 09:30:44.286548 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.286684 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.286669 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 09:30:44.287122 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.287099 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.287210 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.287132 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6tbpg\"" Apr 23 09:30:44.288044 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.288025 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bcm2p" Apr 23 09:30:44.289008 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.288987 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qxzzp\"" Apr 23 09:30:44.289008 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.289002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.289146 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.288990 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.289328 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.289313 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:30:44.289407 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.289385 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0" Apr 23 09:30:44.289760 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.289743 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 09:30:44.290004 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.289985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.290072 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.290048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-57kc9\"" Apr 23 09:30:44.290277 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.290260 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.290492 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.290476 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.292311 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.292292 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 09:30:44.292655 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.292621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.292856 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.292840 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.292986 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.292970 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fqbwj\"" Apr 23 09:30:44.293666 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.293640 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.294999 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.294970 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.295848 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.295828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dpzbh\"" Apr 23 09:30:44.295962 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.295910 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 09:30:44.296027 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.296005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 09:30:44.296027 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.296010 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 09:30:44.296116 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.295914 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 09:30:44.296168 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.296151 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 09:30:44.296572 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.296552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 09:30:44.296989 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.296973 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 09:30:44.297634 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.297609 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-g65s2\"" Apr 
23 09:30:44.297726 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.297714 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 09:30:44.300966 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.300945 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-modprobe-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301057 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.300979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-run\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301057 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-lib-modules\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301057 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-bin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301206 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301094 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301206 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-cni-binary-copy\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301206 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-socket-dir-parent\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8t4\" (UniqueName: \"kubernetes.io/projected/f15f26f5-0cd1-4c82-8351-806c77b25958-kube-api-access-4d8t4\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-node-log\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-script-lib\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.301349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-multus\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysconfig\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-host\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-ovn\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-k8s-cni-cncf-io\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h6w6\" (UniqueName: \"kubernetes.io/projected/be1c69a7-597d-45d8-8fb3-502373eacdd2-kube-api-access-5h6w6\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.301550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-sys\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301558 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-tuned\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-systemd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-etc-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-registration-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.301754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-netns\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-kubelet\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-hostroot\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j42j5\" (UniqueName: \"kubernetes.io/projected/5b48044a-70f3-4dd3-b5c5-d35b911078ea-kube-api-access-j42j5\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-host-slash\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-device-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-tmp\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.301987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9h2p\" (UniqueName: \"kubernetes.io/projected/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-kube-api-access-f9h2p\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct"
Apr 23 09:30:44.302059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-kubernetes\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-kubelet\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-system-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-binary-copy\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-systemd-units\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-env-overrides\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-socket-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-sys-fs\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-cnibin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:44.302452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9caa8b02-3e8a-47d0-9956-0368d378aa04-host\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.302934 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd9dd\" (UniqueName: \"kubernetes.io/projected/9caa8b02-3e8a-47d0-9956-0368d378aa04-kube-api-access-qd9dd\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.302934 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.302934 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wgx\" (UniqueName: \"kubernetes.io/projected/72da84fe-89a1-4d25-9ab3-667e195d5b53-kube-api-access-b8wgx\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.302934 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.302572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-os-release\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.303119 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303068 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.303154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303119 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.303206 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-config\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303206 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-netns\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303281 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovn-node-metrics-cert\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303281 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69qg\" (UniqueName: \"kubernetes.io/projected/53dfc537-d64d-46ce-8310-4e177cebb95f-kube-api-access-b69qg\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.303281 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-multus-certs\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.303397 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.303450 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-conf\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.303499 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-systemd\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.303563 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-slash\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303652 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-log-socket\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303652 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-bin\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.303652 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-daemon-config\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-iptables-alerter-script\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-var-lib-kubelet\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-netd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-os-release\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.303905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-conf-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-etc-kubernetes\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-system-cni-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.304133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-cnibin\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.305255 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgfh\" (UniqueName: \"kubernetes.io/projected/9617f384-f79c-4e0b-86a8-23ef327b0e85-kube-api-access-9kgfh\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.305255 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-var-lib-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.305255 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.305255 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.304250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9caa8b02-3e8a-47d0-9956-0368d378aa04-serviceca\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.346249 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.346222 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:43 +0000 UTC" deadline="2027-11-19 01:48:43.673260522 +0000 UTC"
Apr 23 09:30:44.346249 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.346249 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13792h17m59.327014926s"
Apr 23 09:30:44.392140 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.392115 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 09:30:44.405306 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-kubernetes\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.405306 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-kubelet\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-system-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-binary-copy\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-kubernetes\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-systemd-units\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-env-overrides\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-socket-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-sys-fs\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-cnibin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9caa8b02-3e8a-47d0-9956-0368d378aa04-host\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.405525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd9dd\" (UniqueName: \"kubernetes.io/projected/9caa8b02-3e8a-47d0-9956-0368d378aa04-kube-api-access-qd9dd\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wgx\" (UniqueName: \"kubernetes.io/projected/72da84fe-89a1-4d25-9ab3-667e195d5b53-kube-api-access-b8wgx\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-os-release\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-config\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-netns\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovn-node-metrics-cert\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b69qg\" (UniqueName: \"kubernetes.io/projected/53dfc537-d64d-46ce-8310-4e177cebb95f-kube-api-access-b69qg\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-multus-certs\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-conf\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-systemd\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-slash\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-log-socket\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-bin\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.405915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-daemon-config\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-iptables-alerter-script\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-var-lib-kubelet\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-netd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-os-release\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-conf-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-etc-kubernetes\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-system-cni-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-cnibin\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgfh\" (UniqueName: \"kubernetes.io/projected/9617f384-f79c-4e0b-86a8-23ef327b0e85-kube-api-access-9kgfh\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-binary-copy\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-var-lib-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9caa8b02-3e8a-47d0-9956-0368d378aa04-serviceca\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-modprobe-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9"
Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-run\") pod \"tuned-v6fd9\" (UID:
\"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.406485 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-kubelet\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-lib-modules\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-bin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-os-release\") 
pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-cni-binary-copy\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-socket-dir-parent\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8t4\" (UniqueName: \"kubernetes.io/projected/f15f26f5-0cd1-4c82-8351-806c77b25958-kube-api-access-4d8t4\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-socket-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-node-log\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-script-lib\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-sys-fs\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-env-overrides\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-multus\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-multus\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8cc7fa9d-ac09-438d-a8a9-392735fe315f-agent-certs\") pod \"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.407112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysconfig\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406401 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-host\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-ovn\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-k8s-cni-cncf-io\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h6w6\" (UniqueName: \"kubernetes.io/projected/be1c69a7-597d-45d8-8fb3-502373eacdd2-kube-api-access-5h6w6\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-sys\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-tuned\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-systemd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-etc-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406569 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-ovn\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-cnibin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-config\") pod \"ovnkube-node-54r9n\" 
(UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.407003 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:44.407674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.405978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-systemd-units\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-registration-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-netns\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:30:44.407196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-kubelet\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9caa8b02-3e8a-47d0-9956-0368d378aa04-host\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-multus-certs\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-hostroot\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-k8s-cni-cncf-io\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-sys\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-bin\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-systemd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-cni-netd\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-system-cni-dir\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-registration-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.407477 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:30:44.907428824 +0000 UTC m=+3.062523776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-os-release\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408279 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.406135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-system-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-netns\") pod \"ovnkube-node-54r9n\" (UID: 
\"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-etc-kubernetes\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j42j5\" (UniqueName: \"kubernetes.io/projected/5b48044a-70f3-4dd3-b5c5-d35b911078ea-kube-api-access-j42j5\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-node-log\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-host-slash\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8cc7fa9d-ac09-438d-a8a9-392735fe315f-konnectivity-ca\") pod 
\"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-device-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-tmp\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9h2p\" (UniqueName: \"kubernetes.io/projected/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-kube-api-access-f9h2p\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysctl-conf\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-hostroot\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-daemon-config\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-systemd\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-sysconfig\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-host\") pod 
\"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.408869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9617f384-f79c-4e0b-86a8-23ef327b0e85-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-socket-dir-parent\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovnkube-script-lib\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-etc-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-run-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-kubelet\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-log-socket\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be1c69a7-597d-45d8-8fb3-502373eacdd2-cni-binary-copy\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-slash\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-var-lib-cni-bin\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.407537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9617f384-f79c-4e0b-86a8-23ef327b0e85-cnibin\") pod \"multus-additional-cni-plugins-hktck\" (UID: \"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-cni-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-host-slash\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-modprobe-d\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-var-lib-openvswitch\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-lib-modules\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.409850 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-multus-conf-dir\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/be1c69a7-597d-45d8-8fb3-502373eacdd2-host-run-netns\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-device-dir\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-var-lib-kubelet\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72da84fe-89a1-4d25-9ab3-667e195d5b53-host-run-ovn-kubernetes\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53dfc537-d64d-46ce-8310-4e177cebb95f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.408746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/f15f26f5-0cd1-4c82-8351-806c77b25958-run\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.409228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-iptables-alerter-script\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.409274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9caa8b02-3e8a-47d0-9956-0368d378aa04-serviceca\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p" Apr 23 09:30:44.410671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.410591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-etc-tuned\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.411107 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.410965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72da84fe-89a1-4d25-9ab3-667e195d5b53-ovn-node-metrics-cert\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.411301 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.411280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/f15f26f5-0cd1-4c82-8351-806c77b25958-tmp\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.423932 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.423909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd9dd\" (UniqueName: \"kubernetes.io/projected/9caa8b02-3e8a-47d0-9956-0368d378aa04-kube-api-access-qd9dd\") pod \"node-ca-bcm2p\" (UID: \"9caa8b02-3e8a-47d0-9956-0368d378aa04\") " pod="openshift-image-registry/node-ca-bcm2p" Apr 23 09:30:44.430213 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.430192 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:44.430213 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.430210 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:44.430369 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.430221 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:44.430369 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.430265 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:44.930252522 +0000 UTC m=+3.085347468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:44.432571 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.432551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h6w6\" (UniqueName: \"kubernetes.io/projected/be1c69a7-597d-45d8-8fb3-502373eacdd2-kube-api-access-5h6w6\") pod \"multus-stttf\" (UID: \"be1c69a7-597d-45d8-8fb3-502373eacdd2\") " pod="openshift-multus/multus-stttf" Apr 23 09:30:44.432755 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.432740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69qg\" (UniqueName: \"kubernetes.io/projected/53dfc537-d64d-46ce-8310-4e177cebb95f-kube-api-access-b69qg\") pod \"aws-ebs-csi-driver-node-fpwfr\" (UID: \"53dfc537-d64d-46ce-8310-4e177cebb95f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.434283 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.434266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9h2p\" (UniqueName: \"kubernetes.io/projected/5b5b1ef2-e4bf-4e9d-99fd-2cf856085144-kube-api-access-f9h2p\") pod \"iptables-alerter-lhfct\" (UID: \"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144\") " pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.438267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.438209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgfh\" (UniqueName: \"kubernetes.io/projected/9617f384-f79c-4e0b-86a8-23ef327b0e85-kube-api-access-9kgfh\") pod \"multus-additional-cni-plugins-hktck\" (UID: 
\"9617f384-f79c-4e0b-86a8-23ef327b0e85\") " pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.442190 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.441449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wgx\" (UniqueName: \"kubernetes.io/projected/72da84fe-89a1-4d25-9ab3-667e195d5b53-kube-api-access-b8wgx\") pod \"ovnkube-node-54r9n\" (UID: \"72da84fe-89a1-4d25-9ab3-667e195d5b53\") " pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.443507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.443482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j42j5\" (UniqueName: \"kubernetes.io/projected/5b48044a-70f3-4dd3-b5c5-d35b911078ea-kube-api-access-j42j5\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:30:44.444955 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.444927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8t4\" (UniqueName: \"kubernetes.io/projected/f15f26f5-0cd1-4c82-8351-806c77b25958-kube-api-access-4d8t4\") pod \"tuned-v6fd9\" (UID: \"f15f26f5-0cd1-4c82-8351-806c77b25958\") " pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.508892 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.508858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8cc7fa9d-ac09-438d-a8a9-392735fe315f-agent-certs\") pod \"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.509074 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.508903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/8cc7fa9d-ac09-438d-a8a9-392735fe315f-konnectivity-ca\") pod \"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.509318 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.509291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8cc7fa9d-ac09-438d-a8a9-392735fe315f-konnectivity-ca\") pod \"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.511394 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.511369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8cc7fa9d-ac09-438d-a8a9-392735fe315f-agent-certs\") pod \"konnectivity-agent-gkkhl\" (UID: \"8cc7fa9d-ac09-438d-a8a9-392735fe315f\") " pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.596677 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.596561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" Apr 23 09:30:44.603338 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.603307 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hktck" Apr 23 09:30:44.614032 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.614012 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-stttf" Apr 23 09:30:44.616597 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.616574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" Apr 23 09:30:44.623118 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.623098 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bcm2p" Apr 23 09:30:44.631679 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.631659 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lhfct" Apr 23 09:30:44.638326 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.638303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:30:44.642875 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.642850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gkkhl" Apr 23 09:30:44.724083 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.724057 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf15f26f5_0cd1_4c82_8351_806c77b25958.slice/crio-42cdc9780a2098a17d373762d28a9405bb9a6f02a83cab5aed6cb2e38ec231b2 WatchSource:0}: Error finding container 42cdc9780a2098a17d373762d28a9405bb9a6f02a83cab5aed6cb2e38ec231b2: Status 404 returned error can't find the container with id 42cdc9780a2098a17d373762d28a9405bb9a6f02a83cab5aed6cb2e38ec231b2 Apr 23 09:30:44.724897 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.724860 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9617f384_f79c_4e0b_86a8_23ef327b0e85.slice/crio-2c10273183e2a13af2c23ce589bb235a3769ad6deaee9c86912648db62cf7216 WatchSource:0}: Error finding container 2c10273183e2a13af2c23ce589bb235a3769ad6deaee9c86912648db62cf7216: Status 404 returned error can't find the container with id 2c10273183e2a13af2c23ce589bb235a3769ad6deaee9c86912648db62cf7216 Apr 23 09:30:44.729248 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.729169 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9caa8b02_3e8a_47d0_9956_0368d378aa04.slice/crio-b059e1e787c95694d2d3929d1ed81da728b8b31dc50fa968975d40d09241988d WatchSource:0}: Error finding container b059e1e787c95694d2d3929d1ed81da728b8b31dc50fa968975d40d09241988d: Status 404 returned error can't find the container with id b059e1e787c95694d2d3929d1ed81da728b8b31dc50fa968975d40d09241988d Apr 23 09:30:44.731055 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.730916 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53dfc537_d64d_46ce_8310_4e177cebb95f.slice/crio-9fc95ffa7140af0d961c1797a8d003ff28150283b6b6ae9cc3a90b055a7b9a13 WatchSource:0}: Error finding container 9fc95ffa7140af0d961c1797a8d003ff28150283b6b6ae9cc3a90b055a7b9a13: Status 404 returned error can't find the container with id 9fc95ffa7140af0d961c1797a8d003ff28150283b6b6ae9cc3a90b055a7b9a13 Apr 23 09:30:44.731727 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.731697 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc7fa9d_ac09_438d_a8a9_392735fe315f.slice/crio-dfc3bb8a8f904052970e615c179752aa7c5f52d1294c675fd53478c5b83942b4 WatchSource:0}: Error finding container dfc3bb8a8f904052970e615c179752aa7c5f52d1294c675fd53478c5b83942b4: Status 404 returned error can't find the container with id dfc3bb8a8f904052970e615c179752aa7c5f52d1294c675fd53478c5b83942b4 Apr 23 09:30:44.732621 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.732595 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5b1ef2_e4bf_4e9d_99fd_2cf856085144.slice/crio-88caebbdece54695e88623b8a06d73839c65bb64a848b74ad498126d9050a2e4 WatchSource:0}: Error finding container 88caebbdece54695e88623b8a06d73839c65bb64a848b74ad498126d9050a2e4: Status 404 returned error can't find 
the container with id 88caebbdece54695e88623b8a06d73839c65bb64a848b74ad498126d9050a2e4 Apr 23 09:30:44.734515 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.734375 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72da84fe_89a1_4d25_9ab3_667e195d5b53.slice/crio-78932f5648ca8683cab9792c827bbd5b4bfe5f531080f5fa94de0b2361792eb5 WatchSource:0}: Error finding container 78932f5648ca8683cab9792c827bbd5b4bfe5f531080f5fa94de0b2361792eb5: Status 404 returned error can't find the container with id 78932f5648ca8683cab9792c827bbd5b4bfe5f531080f5fa94de0b2361792eb5 Apr 23 09:30:44.735998 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:30:44.735978 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1c69a7_597d_45d8_8fb3_502373eacdd2.slice/crio-7803947b997d4c80e96cf12b7f76779e32857aedb297b8d21b4b1364baffa35f WatchSource:0}: Error finding container 7803947b997d4c80e96cf12b7f76779e32857aedb297b8d21b4b1364baffa35f: Status 404 returned error can't find the container with id 7803947b997d4c80e96cf12b7f76779e32857aedb297b8d21b4b1364baffa35f Apr 23 09:30:44.911941 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:44.911869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:30:44.912071 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.911984 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:44.912071 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:44.912040 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:30:45.912027896 +0000 UTC m=+4.067122842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:45.012377 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.012331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:30:45.012553 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.012522 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:45.012553 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.012547 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:45.012654 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.012559 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:45.012654 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.012626 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:46.012603452 +0000 UTC m=+4.167698419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:45.343615 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.343385 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:45.346510 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.346384 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:43 +0000 UTC" deadline="2027-11-05 18:31:02.978872108 +0000 UTC" Apr 23 09:30:45.346510 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.346412 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13473h0m17.632464065s" Apr 23 09:30:45.426298 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.426216 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerStarted","Data":"2c10273183e2a13af2c23ce589bb235a3769ad6deaee9c86912648db62cf7216"} Apr 23 09:30:45.432483 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.432422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" 
event={"ID":"f15f26f5-0cd1-4c82-8351-806c77b25958","Type":"ContainerStarted","Data":"42cdc9780a2098a17d373762d28a9405bb9a6f02a83cab5aed6cb2e38ec231b2"} Apr 23 09:30:45.436197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.436146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-stttf" event={"ID":"be1c69a7-597d-45d8-8fb3-502373eacdd2","Type":"ContainerStarted","Data":"7803947b997d4c80e96cf12b7f76779e32857aedb297b8d21b4b1364baffa35f"} Apr 23 09:30:45.438656 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.438597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"78932f5648ca8683cab9792c827bbd5b4bfe5f531080f5fa94de0b2361792eb5"} Apr 23 09:30:45.443318 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.443293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bcm2p" event={"ID":"9caa8b02-3e8a-47d0-9956-0368d378aa04","Type":"ContainerStarted","Data":"b059e1e787c95694d2d3929d1ed81da728b8b31dc50fa968975d40d09241988d"} Apr 23 09:30:45.448512 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.447916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" event={"ID":"d85ba369ef6b926856de0defa6be0a9e","Type":"ContainerStarted","Data":"37c4304bf9930681d0537362557905f6f687689fc9b5cc944e6c511eb9495929"} Apr 23 09:30:45.454244 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.454201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lhfct" event={"ID":"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144","Type":"ContainerStarted","Data":"88caebbdece54695e88623b8a06d73839c65bb64a848b74ad498126d9050a2e4"} Apr 23 09:30:45.456564 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.456519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-gkkhl" event={"ID":"8cc7fa9d-ac09-438d-a8a9-392735fe315f","Type":"ContainerStarted","Data":"dfc3bb8a8f904052970e615c179752aa7c5f52d1294c675fd53478c5b83942b4"}
Apr 23 09:30:45.464525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.464486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" event={"ID":"53dfc537-d64d-46ce-8310-4e177cebb95f","Type":"ContainerStarted","Data":"9fc95ffa7140af0d961c1797a8d003ff28150283b6b6ae9cc3a90b055a7b9a13"}
Apr 23 09:30:45.919324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:45.919286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:45.919527 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.919507 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:45.919635 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:45.919587 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:30:47.919567341 +0000 UTC m=+6.074662293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:46.019879 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:46.019837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:46.020047 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.019992 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:30:46.020047 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.020012 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:30:46.020047 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.020024 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:46.020201 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.020079 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:48.020060972 +0000 UTC m=+6.175155918 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:46.404710 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:46.404633 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:46.405238 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.404802 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:46.405339 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:46.405319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:46.405440 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:46.405417 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:47.555243 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.555175 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-162.ec2.internal" podStartSLOduration=4.555155886 podStartE2EDuration="4.555155886s" podCreationTimestamp="2026-04-23 09:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:30:45.461704811 +0000 UTC m=+3.616799782" watchObservedRunningTime="2026-04-23 09:30:47.555155886 +0000 UTC m=+5.710250856"
Apr 23 09:30:47.556620 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.555904 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6mvdm"]
Apr 23 09:30:47.557430 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.557407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.557509 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:47.557482 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:47.634232 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.634191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.634366 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.634266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-dbus\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.634366 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.634314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-kubelet-config\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.734732 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.734655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-dbus\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.734732 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.734718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-kubelet-config\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.734960 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.734767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.734960 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:47.734921 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:47.735068 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:47.734992 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:48.234965508 +0000 UTC m=+6.390060474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:47.735364 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.735339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-dbus\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.735453 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.735408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-kubelet-config\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:47.937113 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:47.937072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:47.937288 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:47.937220 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:47.937288 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:47.937283 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:30:51.937265891 +0000 UTC m=+10.092360839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:48.037793 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.037693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:48.037921 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.037867 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:30:48.037921 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.037887 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:30:48.037921 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.037899 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:48.038029 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.037959 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:52.037940928 +0000 UTC m=+10.193035880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:48.239797 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.239739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:48.239977 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.239900 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:48.239977 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.239965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:49.23994843 +0000 UTC m=+7.395043381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:48.404838 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.404723 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:48.405005 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.404894 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:48.405171 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.405145 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:48.405268 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:48.405244 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:48.477167 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.477066 2577 generic.go:358] "Generic (PLEG): container finished" podID="8dc30e02f5cc8b2c2c7c8ea212f0c18d" containerID="665767429d6e011c4aa353dcb9a251d8085201b767065ba3df17c9c96d9438e8" exitCode=0
Apr 23 09:30:48.477167 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:48.477122 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" event={"ID":"8dc30e02f5cc8b2c2c7c8ea212f0c18d","Type":"ContainerDied","Data":"665767429d6e011c4aa353dcb9a251d8085201b767065ba3df17c9c96d9438e8"}
Apr 23 09:30:49.249043 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:49.249006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:49.249439 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:49.249169 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:49.249439 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:49.249238 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:51.249222032 +0000 UTC m=+9.404316979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:49.403806 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:49.403757 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:49.403981 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:49.403895 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:50.405598 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:50.405566 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:50.406038 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:50.405673 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:50.406038 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:50.405988 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:50.406145 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:50.406068 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:51.267522 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:51.267479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:51.267696 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:51.267636 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:51.267756 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:51.267696 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:55.267680019 +0000 UTC m=+13.422774966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:51.404373 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:51.403798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:51.404373 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:51.403996 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:51.973888 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:51.973848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:51.974337 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:51.974010 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:51.974337 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:51.974077 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:30:59.974057636 +0000 UTC m=+18.129152591 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:52.074668 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.074629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:52.074854 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.074823 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:30:52.074854 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.074843 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:30:52.074854 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.074856 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:52.075045 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.074917 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:00.074898145 +0000 UTC m=+18.229993096 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:52.406027 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.405928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:52.406027 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.405929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:52.406804 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.406755 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:52.406929 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:52.406879 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:52.941595 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.941019 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-82v2z"]
Apr 23 09:30:52.943535 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.943510 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:52.946131 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.946108 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-c65vd\""
Apr 23 09:30:52.946354 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.946336 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 09:30:52.946865 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.946846 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 09:30:52.985361 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.985328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad34935-1d25-4cee-bc83-179a47166c9e-hosts-file\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:52.985756 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.985394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad34935-1d25-4cee-bc83-179a47166c9e-tmp-dir\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:52.985756 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:52.985440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qmj\" (UniqueName: \"kubernetes.io/projected/4ad34935-1d25-4cee-bc83-179a47166c9e-kube-api-access-n9qmj\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.086220 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.086183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad34935-1d25-4cee-bc83-179a47166c9e-tmp-dir\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.086405 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.086259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9qmj\" (UniqueName: \"kubernetes.io/projected/4ad34935-1d25-4cee-bc83-179a47166c9e-kube-api-access-n9qmj\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.086405 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.086312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad34935-1d25-4cee-bc83-179a47166c9e-hosts-file\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.086511 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.086407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad34935-1d25-4cee-bc83-179a47166c9e-hosts-file\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.086730 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.086706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad34935-1d25-4cee-bc83-179a47166c9e-tmp-dir\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.100377 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.100353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9qmj\" (UniqueName: \"kubernetes.io/projected/4ad34935-1d25-4cee-bc83-179a47166c9e-kube-api-access-n9qmj\") pod \"node-resolver-82v2z\" (UID: \"4ad34935-1d25-4cee-bc83-179a47166c9e\") " pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.255758 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.255677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-82v2z"
Apr 23 09:30:53.404605 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:53.404564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:53.404766 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:53.404706 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:54.405315 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:54.404759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:54.405315 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:54.404759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:54.405315 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:54.404930 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:54.405315 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:54.404967 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:55.303520 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:55.303483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:55.303719 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:55.303640 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:55.303719 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:55.303710 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:03.303691856 +0000 UTC m=+21.458786807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered
Apr 23 09:30:55.404471 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:55.404437 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:55.404631 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:55.404554 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:56.403968 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:56.403932 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:30:56.403968 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:56.403929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:56.404542 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:56.404121 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:30:56.404542 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:56.404133 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:30:57.404794 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:57.404752 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:30:57.405234 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:57.404882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:30:58.404368 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:58.404331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:30:58.404540 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:58.404331 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:30:58.404540 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:58.404470 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0" Apr 23 09:30:58.404540 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:58.404533 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea" Apr 23 09:30:59.404378 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:30:59.404343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:30:59.404847 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:30:59.404462 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9" Apr 23 09:31:00.038840 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:00.038793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:00.039025 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.038908 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:31:00.039025 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.038995 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:31:16.038966021 +0000 UTC m=+34.194061020 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:31:00.139616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:00.139574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:00.139834 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.139762 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:31:00.139834 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.139804 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:31:00.139834 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.139818 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gvh9l for pod openshift-network-diagnostics/network-check-target-jrn69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:31:00.140005 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.139886 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l podName:a3241665-735b-450b-a779-0fc671dcc4d0 nodeName:}" failed. 
No retries permitted until 2026-04-23 09:31:16.139865586 +0000 UTC m=+34.294960539 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvh9l" (UniqueName: "kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l") pod "network-check-target-jrn69" (UID: "a3241665-735b-450b-a779-0fc671dcc4d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:31:00.404745 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:00.404632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:00.404745 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:00.404690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:00.405240 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.404819 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea" Apr 23 09:31:00.405240 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:00.404977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0" Apr 23 09:31:01.404187 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:01.403909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:01.404345 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:01.404227 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9" Apr 23 09:31:01.935883 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:01.935847 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad34935_1d25_4cee_bc83_179a47166c9e.slice/crio-ee5e9da0648ff2776285a89c92fe74ce8aeaf5898c7df3e17160c64aa8223c54 WatchSource:0}: Error finding container ee5e9da0648ff2776285a89c92fe74ce8aeaf5898c7df3e17160c64aa8223c54: Status 404 returned error can't find the container with id ee5e9da0648ff2776285a89c92fe74ce8aeaf5898c7df3e17160c64aa8223c54 Apr 23 09:31:02.405234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.404984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:02.405321 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.405046 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:02.405369 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:02.405346 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea" Apr 23 09:31:02.405405 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:02.405383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0" Apr 23 09:31:02.500817 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.500789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-82v2z" event={"ID":"4ad34935-1d25-4cee-bc83-179a47166c9e","Type":"ContainerStarted","Data":"092dc399e70b6a4f51caebd0d99dfca26d26427aa3714546d22e9d1e2ea0b887"} Apr 23 09:31:02.501064 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.500828 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-82v2z" event={"ID":"4ad34935-1d25-4cee-bc83-179a47166c9e","Type":"ContainerStarted","Data":"ee5e9da0648ff2776285a89c92fe74ce8aeaf5898c7df3e17160c64aa8223c54"} Apr 23 09:31:02.502244 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.502221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" 
event={"ID":"8dc30e02f5cc8b2c2c7c8ea212f0c18d","Type":"ContainerStarted","Data":"4ee4a8d3c6d26b970d676424339d3faffa5e04760384653b20f7137a7335a227"} Apr 23 09:31:02.503323 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.503302 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gkkhl" event={"ID":"8cc7fa9d-ac09-438d-a8a9-392735fe315f","Type":"ContainerStarted","Data":"ed75ed7e1bd55a09079a26b7dcb64ec362da83161631b8fd412f52a3542513ff"} Apr 23 09:31:02.504607 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.504583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" event={"ID":"53dfc537-d64d-46ce-8310-4e177cebb95f","Type":"ContainerStarted","Data":"2e95340a3073697812d2c71fb243cc55cd9ffa0822d44dae7ee6daa902ab59e2"} Apr 23 09:31:02.505762 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.505736 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerStarted","Data":"2a17ecf2aa50666fe63e8fd32b49111170a7c58fe17cba43b62580c888dfea64"} Apr 23 09:31:02.506818 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.506800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" event={"ID":"f15f26f5-0cd1-4c82-8351-806c77b25958","Type":"ContainerStarted","Data":"389751834a557bd767aee2c43de76e01c80dd101949132fbfc36091de1ea1237"} Apr 23 09:31:02.507869 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.507844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-stttf" event={"ID":"be1c69a7-597d-45d8-8fb3-502373eacdd2","Type":"ContainerStarted","Data":"3e6a5e876c023acd7dfaef85bb45dd4382aeff849cc72453e3e82901d0ac133c"} Apr 23 09:31:02.509025 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.509009 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log" Apr 23 09:31:02.509295 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.509277 2577 generic.go:358] "Generic (PLEG): container finished" podID="72da84fe-89a1-4d25-9ab3-667e195d5b53" containerID="4966e6ff0f2d2a57e91a3f3b2348e395a80928c502a4e516fe0f7b74aeb991ad" exitCode=1 Apr 23 09:31:02.509356 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.509330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerDied","Data":"4966e6ff0f2d2a57e91a3f3b2348e395a80928c502a4e516fe0f7b74aeb991ad"} Apr 23 09:31:02.509356 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.509345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"f773582e8965f6556799f2f212a32d7b1afc3e86a94d0a9fec6a1e710b25861d"} Apr 23 09:31:02.510417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.510397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bcm2p" event={"ID":"9caa8b02-3e8a-47d0-9956-0368d378aa04","Type":"ContainerStarted","Data":"732678863b5f4513dabed25a3ee0572aa1b7ff86e804b5621a971df955119461"} Apr 23 09:31:02.514891 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.514859 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-82v2z" podStartSLOduration=10.514850596 podStartE2EDuration="10.514850596s" podCreationTimestamp="2026-04-23 09:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:02.514804451 +0000 UTC m=+20.669899417" watchObservedRunningTime="2026-04-23 09:31:02.514850596 +0000 UTC m=+20.669945565" Apr 23 
09:31:02.527749 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.527664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-162.ec2.internal" podStartSLOduration=19.527653996 podStartE2EDuration="19.527653996s" podCreationTimestamp="2026-04-23 09:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:02.527105658 +0000 UTC m=+20.682200626" watchObservedRunningTime="2026-04-23 09:31:02.527653996 +0000 UTC m=+20.682748965" Apr 23 09:31:02.563376 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.563333 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-stttf" podStartSLOduration=3.306595914 podStartE2EDuration="20.56332147s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.737649227 +0000 UTC m=+2.892744188" lastFinishedPulling="2026-04-23 09:31:01.994374796 +0000 UTC m=+20.149469744" observedRunningTime="2026-04-23 09:31:02.562863368 +0000 UTC m=+20.717958347" watchObservedRunningTime="2026-04-23 09:31:02.56332147 +0000 UTC m=+20.718416439" Apr 23 09:31:02.578240 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.578196 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v6fd9" podStartSLOduration=3.348486717 podStartE2EDuration="20.578184617s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.726372077 +0000 UTC m=+2.881467024" lastFinishedPulling="2026-04-23 09:31:01.956069969 +0000 UTC m=+20.111164924" observedRunningTime="2026-04-23 09:31:02.578011541 +0000 UTC m=+20.733106510" watchObservedRunningTime="2026-04-23 09:31:02.578184617 +0000 UTC m=+20.733279586" Apr 23 09:31:02.591578 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:02.591538 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bcm2p" podStartSLOduration=8.12188297 podStartE2EDuration="20.591527385s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.731331576 +0000 UTC m=+2.886426523" lastFinishedPulling="2026-04-23 09:30:57.200975992 +0000 UTC m=+15.356070938" observedRunningTime="2026-04-23 09:31:02.591387695 +0000 UTC m=+20.746482663" watchObservedRunningTime="2026-04-23 09:31:02.591527385 +0000 UTC m=+20.746622353" Apr 23 09:31:03.366635 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.366458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:03.367191 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:03.366616 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:31:03.367191 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:03.366763 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret podName:7cbf79e1-2f10-446e-b5c9-8905a2c257a9 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:19.366749135 +0000 UTC m=+37.521844082 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret") pod "global-pull-secret-syncer-6mvdm" (UID: "7cbf79e1-2f10-446e-b5c9-8905a2c257a9") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:31:03.395035 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.395008 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 09:31:03.403918 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.403900 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:03.404003 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:03.403986 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9" Apr 23 09:31:03.513905 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.513818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lhfct" event={"ID":"5b5b1ef2-e4bf-4e9d-99fd-2cf856085144","Type":"ContainerStarted","Data":"92d566e9e1b8eef2725fba0828486f20fd5e9404aaa9832e40bebbb511e02077"} Apr 23 09:31:03.515348 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.515322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" event={"ID":"53dfc537-d64d-46ce-8310-4e177cebb95f","Type":"ContainerStarted","Data":"02427ef1cacd2943c95b75eceb9b4271546343f0c6a910fce26ea4e1efb20935"} Apr 23 09:31:03.516648 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.516622 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="2a17ecf2aa50666fe63e8fd32b49111170a7c58fe17cba43b62580c888dfea64" exitCode=0 Apr 23 09:31:03.516798 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.516654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"2a17ecf2aa50666fe63e8fd32b49111170a7c58fe17cba43b62580c888dfea64"} Apr 23 09:31:03.519226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.519212 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log" Apr 23 09:31:03.519637 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.519606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" 
event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"fafe79202ccfd5a5d9ec991f9fe722c7b0bba20878709d94e165ee0e52419aa0"} Apr 23 09:31:03.519637 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.519635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"3ec8610e690db1bc8fc2f75747028702ea110ab95b306094aac6cfcc2f6ff2c1"} Apr 23 09:31:03.519799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.519649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"af55377d73b6f49f72d3d862bd20e137d215d4d06911463d285164d229763dc2"} Apr 23 09:31:03.519799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.519663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"ef91a97415761f4baf6a95f676d390125d060560fe30f2449582dafb32a96cc0"} Apr 23 09:31:03.526873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.526836 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lhfct" podStartSLOduration=4.283950259 podStartE2EDuration="21.5268263s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.734993694 +0000 UTC m=+2.890088641" lastFinishedPulling="2026-04-23 09:31:01.977869722 +0000 UTC m=+20.132964682" observedRunningTime="2026-04-23 09:31:03.526814613 +0000 UTC m=+21.681909581" watchObservedRunningTime="2026-04-23 09:31:03.5268263 +0000 UTC m=+21.681921269" Apr 23 09:31:03.527055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:03.527035 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-gkkhl" podStartSLOduration=4.328025516 podStartE2EDuration="21.527031164s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.733828408 +0000 UTC m=+2.888923354" lastFinishedPulling="2026-04-23 09:31:01.932834052 +0000 UTC m=+20.087929002" observedRunningTime="2026-04-23 09:31:02.605384173 +0000 UTC m=+20.760479270" watchObservedRunningTime="2026-04-23 09:31:03.527031164 +0000 UTC m=+21.682126164" Apr 23 09:31:04.375046 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:04.374924 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T09:31:03.3950301Z","UUID":"4af4f91c-2314-447f-96ed-899d04c05cb9","Handler":null,"Name":"","Endpoint":""} Apr 23 09:31:04.376667 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:04.376643 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 09:31:04.376819 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:04.376676 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 09:31:04.403804 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:04.403759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:04.403987 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:04.403759 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:04.403987 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:04.403902 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0" Apr 23 09:31:04.404113 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:04.403993 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea" Apr 23 09:31:05.403870 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:05.403793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:05.404442 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:05.403901 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:05.525440 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:05.525399 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" event={"ID":"53dfc537-d64d-46ce-8310-4e177cebb95f","Type":"ContainerStarted","Data":"0902f93f5dcb09e47f5223dd60190fd6b3973adb5774f16677b81e8d94d0a6cf"}
Apr 23 09:31:05.528740 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:05.528714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log"
Apr 23 09:31:05.529132 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:05.529090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"6d155da5015a24152dabe3af603255e358925f3fb17b121d36d5bbcf0271e405"}
Apr 23 09:31:05.543375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:05.543323 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpwfr" podStartSLOduration=3.672366793 podStartE2EDuration="23.543309477s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.732528249 +0000 UTC m=+2.887623198" lastFinishedPulling="2026-04-23 09:31:04.603470931 +0000 UTC m=+22.758565882" observedRunningTime="2026-04-23 09:31:05.543306151 +0000 UTC m=+23.698401122" watchObservedRunningTime="2026-04-23 09:31:05.543309477 +0000 UTC m=+23.698404444"
Apr 23 09:31:06.404652 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:06.404564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:06.404652 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:06.404606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:06.405326 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:06.404696 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:31:06.405326 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:06.404854 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:31:07.305815 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:07.305763 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gkkhl"
Apr 23 09:31:07.306401 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:07.306383 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gkkhl"
Apr 23 09:31:07.404654 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:07.404623 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:07.405093 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:07.404735 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:07.438942 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:07.438906 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gkkhl"
Apr 23 09:31:07.439613 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:07.439587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gkkhl"
Apr 23 09:31:08.404799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.404527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:08.404799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.404539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:08.405406 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:08.404830 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:31:08.405406 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:08.404901 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:31:08.536338 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.536308 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log"
Apr 23 09:31:08.536657 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.536631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"bc1daee43384a21426b82bc4659d031f12148bbfbb62b9a3b4cf6a507c1aacde"}
Apr 23 09:31:08.537019 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.536980 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:31:08.537161 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.537146 2577 scope.go:117] "RemoveContainer" containerID="4966e6ff0f2d2a57e91a3f3b2348e395a80928c502a4e516fe0f7b74aeb991ad"
Apr 23 09:31:08.538308 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.538286 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="aa8764a57c9ad31a27e9b296a5d1c066cd24ce735f87d95a92f6decd2eca7fbb" exitCode=0
Apr 23 09:31:08.538404 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.538364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"aa8764a57c9ad31a27e9b296a5d1c066cd24ce735f87d95a92f6decd2eca7fbb"}
Apr 23 09:31:08.554824 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:08.554806 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:31:09.404757 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.404721 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:09.405098 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:09.404897 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:09.542302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.542272 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="e1e8b1429986a8334ee5b98b2737a4513755acdda1c83a3ffda486e46765e016" exitCode=0
Apr 23 09:31:09.542427 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.542357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"e1e8b1429986a8334ee5b98b2737a4513755acdda1c83a3ffda486e46765e016"}
Apr 23 09:31:09.546323 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.546300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log"
Apr 23 09:31:09.546681 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.546660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" event={"ID":"72da84fe-89a1-4d25-9ab3-667e195d5b53","Type":"ContainerStarted","Data":"930c9c5d885a10579f37e0f486f1f2b73063b6733645da68011d6f0f726378cd"}
Apr 23 09:31:09.546884 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.546862 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 09:31:09.547147 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.547131 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:31:09.562620 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.562594 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:31:09.597373 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.597293 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" podStartSLOduration=10.301400545 podStartE2EDuration="27.597279136s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.735819505 +0000 UTC m=+2.890914451" lastFinishedPulling="2026-04-23 09:31:02.031698093 +0000 UTC m=+20.186793042" observedRunningTime="2026-04-23 09:31:09.596403596 +0000 UTC m=+27.751498565" watchObservedRunningTime="2026-04-23 09:31:09.597279136 +0000 UTC m=+27.752374104"
Apr 23 09:31:09.722980 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.722940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jrn69"]
Apr 23 09:31:09.723132 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.723087 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:09.723228 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:09.723204 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:31:09.725404 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.725380 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zh888"]
Apr 23 09:31:09.725539 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.725505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:09.725628 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:09.725609 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:31:09.733486 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.733467 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6mvdm"]
Apr 23 09:31:09.733600 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:09.733566 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:09.733678 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:09.733657 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:10.548288 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:10.548257 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 09:31:11.404349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.404310 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:11.404349 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.404337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:11.404540 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:11.404434 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:31:11.404540 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.404456 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:11.404540 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:11.404516 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:31:11.404635 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:11.404577 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:11.552175 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.552085 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="3f09074e310436b8149282f1ddf84eeed1c7270fd003a47c797194fcab166a20" exitCode=0
Apr 23 09:31:11.552585 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.552168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"3f09074e310436b8149282f1ddf84eeed1c7270fd003a47c797194fcab166a20"}
Apr 23 09:31:11.552585 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:11.552391 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 09:31:13.271016 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.270976 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n"
Apr 23 09:31:13.271637 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.271237 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 09:31:13.282263 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.282198 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" podUID="72da84fe-89a1-4d25-9ab3-667e195d5b53" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 23 09:31:13.292009 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.291972 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" podUID="72da84fe-89a1-4d25-9ab3-667e195d5b53" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 23 09:31:13.404810 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.404760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:13.404997 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.404817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:13.404997 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:13.404944 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mvdm" podUID="7cbf79e1-2f10-446e-b5c9-8905a2c257a9"
Apr 23 09:31:13.405204 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:13.405165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh888" podUID="5b48044a-70f3-4dd3-b5c5-d35b911078ea"
Apr 23 09:31:13.405287 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:13.405242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:13.405371 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:13.405341 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jrn69" podUID="a3241665-735b-450b-a779-0fc671dcc4d0"
Apr 23 09:31:15.188154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.188077 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-162.ec2.internal" event="NodeReady"
Apr 23 09:31:15.188613 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.188231 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 09:31:15.226426 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.226392 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"]
Apr 23 09:31:15.229951 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.229929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.232267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.232240 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 09:31:15.232397 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.232368 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-shblk\""
Apr 23 09:31:15.232643 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.232576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 09:31:15.232643 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.232576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 09:31:15.240058 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.238677 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pw8pn"]
Apr 23 09:31:15.241754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.241730 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 09:31:15.243970 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.243951 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.244361 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.244342 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"]
Apr 23 09:31:15.245170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.245145 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-scc48"]
Apr 23 09:31:15.246043 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.246026 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\""
Apr 23 09:31:15.246133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.246056 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 09:31:15.246133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.246064 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 09:31:15.249873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.249855 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pw8pn"]
Apr 23 09:31:15.250013 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.249998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-scc48"
Apr 23 09:31:15.252073 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.252054 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 09:31:15.252182 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.252093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 09:31:15.252341 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.252326 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\""
Apr 23 09:31:15.252548 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.252535 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 09:31:15.257814 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.257793 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-scc48"]
Apr 23 09:31:15.356917 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.356881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwvz\" (UniqueName: \"kubernetes.io/projected/ffda6afc-6275-4e59-9ea6-12d458a70e5e-kube-api-access-qpwvz\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.356947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d38301ed-fb73-4e46-aa00-455f456c3ca3-tmp-dir\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.356996 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfkd\" (UniqueName: \"kubernetes.io/projected/d38301ed-fb73-4e46-aa00-455f456c3ca3-kube-api-access-mmfkd\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357101 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38301ed-fb73-4e46-aa00-455f456c3ca3-config-volume\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkk4\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.357418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.357337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.404463 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.404433 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:31:15.404631 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.404433 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm"
Apr 23 09:31:15.404631 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.404434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888"
Apr 23 09:31:15.407116 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 09:31:15.407116 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407110 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 09:31:15.407324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwhsm\""
Apr 23 09:31:15.407324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\""
Apr 23 09:31:15.407324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 09:31:15.407501 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.407474 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 09:31:15.457989 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.457909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwvz\" (UniqueName: \"kubernetes.io/projected/ffda6afc-6275-4e59-9ea6-12d458a70e5e-kube-api-access-qpwvz\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48"
Apr 23 09:31:15.457989 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.457966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d38301ed-fb73-4e46-aa00-455f456c3ca3-tmp-dir\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.458195 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfkd\" (UniqueName: \"kubernetes.io/projected/d38301ed-fb73-4e46-aa00-455f456c3ca3-kube-api-access-mmfkd\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.458249 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48"
Apr 23 09:31:15.458249 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458312 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458312 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d38301ed-fb73-4e46-aa00-455f456c3ca3-tmp-dir\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.458312 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38301ed-fb73-4e46-aa00-455f456c3ca3-config-volume\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.458421 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkk4\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458421 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458365 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:15.458421 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:15.458421 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458553 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458553 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458553 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.458553 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.458552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458789 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458843 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.958823412 +0000 UTC m=+34.113918362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458846 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458861 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found
Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458870 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.958858913 +0000 UTC m=+34.113953868 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:15.459020 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.458902 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.958893695 +0000 UTC m=+34.113988642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:15.459376 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.459299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.459376 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.459315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38301ed-fb73-4e46-aa00-455f456c3ca3-config-volume\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:15.459376 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.459334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.459496 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.459476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.463981 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.463666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.463981 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.463693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.471439 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.471408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwvz\" (UniqueName: \"kubernetes.io/projected/ffda6afc-6275-4e59-9ea6-12d458a70e5e-kube-api-access-qpwvz\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:15.472195 
ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.472174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfkd\" (UniqueName: \"kubernetes.io/projected/d38301ed-fb73-4e46-aa00-455f456c3ca3-kube-api-access-mmfkd\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:15.472286 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.472264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.472583 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.472561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkk4\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.962126 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.962090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:15.962310 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.962150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 
23 09:31:15.962310 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:15.962185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:15.962310 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962257 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:15.962310 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962304 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:15.962466 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962319 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found Apr 23 09:31:15.962466 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962332 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:15.962466 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962337 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:16.96232174 +0000 UTC m=+35.117416688 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:15.962466 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962410 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:16.96239011 +0000 UTC m=+35.117485065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:15.962466 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:15.962425 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:16.962415727 +0000 UTC m=+35.117510680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found Apr 23 09:31:16.062696 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.062661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:16.062881 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.062832 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 09:31:16.062928 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.062895 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs podName:5b48044a-70f3-4dd3-b5c5-d35b911078ea nodeName:}" failed. No retries permitted until 2026-04-23 09:31:48.062878198 +0000 UTC m=+66.217973145 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs") pod "network-metrics-daemon-zh888" (UID: "5b48044a-70f3-4dd3-b5c5-d35b911078ea") : secret "metrics-daemon-secret" not found Apr 23 09:31:16.163376 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.163335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:16.166659 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.166632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvh9l\" (UniqueName: \"kubernetes.io/projected/a3241665-735b-450b-a779-0fc671dcc4d0-kube-api-access-gvh9l\") pod \"network-check-target-jrn69\" (UID: \"a3241665-735b-450b-a779-0fc671dcc4d0\") " pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:16.316999 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.316963 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:16.971731 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.971697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:16.971938 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.971756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:16.971938 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.971879 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:16.971938 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.971882 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:16.972078 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.971953 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:18.971922441 +0000 UTC m=+37.127017388 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found Apr 23 09:31:16.972078 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.971953 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found Apr 23 09:31:16.972078 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:16.971989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:16.972078 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.972013 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:18.971997218 +0000 UTC m=+37.127092181 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:16.972237 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.972102 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:16.972237 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:16.972143 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:18.972130709 +0000 UTC m=+37.127225660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:17.411108 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:17.411079 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jrn69"] Apr 23 09:31:17.497087 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:17.497051 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3241665_735b_450b_a779_0fc671dcc4d0.slice/crio-8a21aa6059a77956550ccdc36512d81e84881869083338661a9ffebc8c89be12 WatchSource:0}: Error finding container 8a21aa6059a77956550ccdc36512d81e84881869083338661a9ffebc8c89be12: Status 404 returned error can't find the container with id 8a21aa6059a77956550ccdc36512d81e84881869083338661a9ffebc8c89be12 Apr 23 09:31:17.564504 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:17.564473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-diagnostics/network-check-target-jrn69" event={"ID":"a3241665-735b-450b-a779-0fc671dcc4d0","Type":"ContainerStarted","Data":"8a21aa6059a77956550ccdc36512d81e84881869083338661a9ffebc8c89be12"} Apr 23 09:31:18.569912 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:18.569879 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="e00fc3b75b7b567bc83ac8aabe8327e6e46855e74edf55dd8ecfb8e2fbbf6cca" exitCode=0 Apr 23 09:31:18.570286 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:18.569926 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"e00fc3b75b7b567bc83ac8aabe8327e6e46855e74edf55dd8ecfb8e2fbbf6cca"} Apr 23 09:31:18.990482 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:18.990405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:18.990482 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:18.990464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:18.990511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " 
pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990581 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990607 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990625 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990626 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990673 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:22.990650504 +0000 UTC m=+41.145745456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:18.990710 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990699 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:22.990680956 +0000 UTC m=+41.145775918 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found Apr 23 09:31:18.990997 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:18.990718 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:22.990708475 +0000 UTC m=+41.145803433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:19.394114 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:19.394078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:19.398384 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:19.398347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cbf79e1-2f10-446e-b5c9-8905a2c257a9-original-pull-secret\") pod \"global-pull-secret-syncer-6mvdm\" (UID: \"7cbf79e1-2f10-446e-b5c9-8905a2c257a9\") " pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:19.575389 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:19.575296 2577 generic.go:358] "Generic (PLEG): container finished" podID="9617f384-f79c-4e0b-86a8-23ef327b0e85" containerID="60b6e667791706434e935d5fdd56ba88615b359ad57532882b16bed527bb875b" 
exitCode=0 Apr 23 09:31:19.575389 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:19.575365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerDied","Data":"60b6e667791706434e935d5fdd56ba88615b359ad57532882b16bed527bb875b"} Apr 23 09:31:19.623826 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:19.623760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mvdm" Apr 23 09:31:20.400654 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:20.400629 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6mvdm"] Apr 23 09:31:20.404262 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:20.404236 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cbf79e1_2f10_446e_b5c9_8905a2c257a9.slice/crio-fd007f91f0efeccb23ad3e45aca376d3e9632a4943d49770a2da7387d105dcc6 WatchSource:0}: Error finding container fd007f91f0efeccb23ad3e45aca376d3e9632a4943d49770a2da7387d105dcc6: Status 404 returned error can't find the container with id fd007f91f0efeccb23ad3e45aca376d3e9632a4943d49770a2da7387d105dcc6 Apr 23 09:31:20.577887 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:20.577804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6mvdm" event={"ID":"7cbf79e1-2f10-446e-b5c9-8905a2c257a9","Type":"ContainerStarted","Data":"fd007f91f0efeccb23ad3e45aca376d3e9632a4943d49770a2da7387d105dcc6"} Apr 23 09:31:20.580585 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:20.580562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hktck" event={"ID":"9617f384-f79c-4e0b-86a8-23ef327b0e85","Type":"ContainerStarted","Data":"02c00a1bd82affd2e2d280af3cac4421a07763adb4cbf142c40d57aa83f46b9e"} 
Apr 23 09:31:20.603038 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:20.602966 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hktck" podStartSLOduration=5.801884988 podStartE2EDuration="38.602941474s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:30:44.727004257 +0000 UTC m=+2.882099204" lastFinishedPulling="2026-04-23 09:31:17.528060739 +0000 UTC m=+35.683155690" observedRunningTime="2026-04-23 09:31:20.601673781 +0000 UTC m=+38.756768751" watchObservedRunningTime="2026-04-23 09:31:20.602941474 +0000 UTC m=+38.758036443" Apr 23 09:31:21.584472 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:21.584212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jrn69" event={"ID":"a3241665-735b-450b-a779-0fc671dcc4d0","Type":"ContainerStarted","Data":"d31f924fcd4e520839688ff1fc4e209bc721650da6d3f87bbb90441902f0ad6c"} Apr 23 09:31:21.600112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:21.600060 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jrn69" podStartSLOduration=36.493935601 podStartE2EDuration="39.600045008s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:31:17.505808389 +0000 UTC m=+35.660903347" lastFinishedPulling="2026-04-23 09:31:20.611917806 +0000 UTC m=+38.767012754" observedRunningTime="2026-04-23 09:31:21.59867285 +0000 UTC m=+39.753767819" watchObservedRunningTime="2026-04-23 09:31:21.600045008 +0000 UTC m=+39.755139977" Apr 23 09:31:22.586406 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:22.586374 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jrn69" Apr 23 09:31:23.020903 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:23.020861 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:23.021066 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:23.020924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:23.021066 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:23.020954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:23.021066 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021028 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:23.021066 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021046 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:23.021066 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021049 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found Apr 23 09:31:23.021246 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021105 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:23.021246 ip-10-0-140-162 
kubenswrapper[2577]: E0423 09:31:23.021111 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:31.021094891 +0000 UTC m=+49.176189844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found Apr 23 09:31:23.021246 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021169 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:31.021151535 +0000 UTC m=+49.176246492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:23.021246 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:23.021185 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:31.021175626 +0000 UTC m=+49.176270576 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:25.593065 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:25.593027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6mvdm" event={"ID":"7cbf79e1-2f10-446e-b5c9-8905a2c257a9","Type":"ContainerStarted","Data":"3b5252ba526c248a5c947f2dcf990a211a5fd6805539ccd0722b0d0c6622d110"} Apr 23 09:31:25.607941 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:25.607893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6mvdm" podStartSLOduration=33.920544054 podStartE2EDuration="38.607877399s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:20.406605302 +0000 UTC m=+38.561700250" lastFinishedPulling="2026-04-23 09:31:25.093938638 +0000 UTC m=+43.249033595" observedRunningTime="2026-04-23 09:31:25.607335885 +0000 UTC m=+43.762430853" watchObservedRunningTime="2026-04-23 09:31:25.607877399 +0000 UTC m=+43.762972367" Apr 23 09:31:27.513399 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:27.513376 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-82v2z_4ad34935-1d25-4cee-bc83-179a47166c9e/dns-node-resolver/0.log" Apr 23 09:31:28.513013 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.512985 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bcm2p_9caa8b02-3e8a-47d0-9956-0368d378aa04/node-ca/0.log" Apr 23 09:31:28.674978 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.674947 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2fqtf"] Apr 23 09:31:28.697937 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.695920 2577 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2fqtf"] Apr 23 09:31:28.697937 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.696050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.702023 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.701360 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 09:31:28.705849 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.705116 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 09:31:28.705849 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.705320 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-kcjz2\"" Apr 23 09:31:28.705849 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.705504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 09:31:28.705849 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.705691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 09:31:28.759881 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.759850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-cabundle\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.760015 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.759904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n6qxr\" (UniqueName: \"kubernetes.io/projected/75cd7467-d6ae-4c15-b25f-58105fb85a1b-kube-api-access-n6qxr\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.760015 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.759984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-key\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.860591 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.860458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-key\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.860710 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.860615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-cabundle\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.860710 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.860654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qxr\" (UniqueName: \"kubernetes.io/projected/75cd7467-d6ae-4c15-b25f-58105fb85a1b-kube-api-access-n6qxr\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.861328 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:31:28.861308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-cabundle\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.863812 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.863768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/75cd7467-d6ae-4c15-b25f-58105fb85a1b-signing-key\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:28.869396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:28.869377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qxr\" (UniqueName: \"kubernetes.io/projected/75cd7467-d6ae-4c15-b25f-58105fb85a1b-kube-api-access-n6qxr\") pod \"service-ca-865cb79987-2fqtf\" (UID: \"75cd7467-d6ae-4c15-b25f-58105fb85a1b\") " pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:29.016172 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:29.016139 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2fqtf" Apr 23 09:31:29.129579 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:29.129490 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2fqtf"] Apr 23 09:31:29.133499 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:29.133468 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cd7467_d6ae_4c15_b25f_58105fb85a1b.slice/crio-36904a65d4738f5fcf95fd9b7edda71005d6d0970a084d8f30d07a9138e8a524 WatchSource:0}: Error finding container 36904a65d4738f5fcf95fd9b7edda71005d6d0970a084d8f30d07a9138e8a524: Status 404 returned error can't find the container with id 36904a65d4738f5fcf95fd9b7edda71005d6d0970a084d8f30d07a9138e8a524 Apr 23 09:31:29.601018 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:29.600977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2fqtf" event={"ID":"75cd7467-d6ae-4c15-b25f-58105fb85a1b","Type":"ContainerStarted","Data":"36904a65d4738f5fcf95fd9b7edda71005d6d0970a084d8f30d07a9138e8a524"} Apr 23 09:31:31.079015 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:31.078955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:31.079046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: 
I0423 09:31:31.079096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079106 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079180 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079190 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert podName:ffda6afc-6275-4e59-9ea6-12d458a70e5e nodeName:}" failed. No retries permitted until 2026-04-23 09:31:47.079169789 +0000 UTC m=+65.234264755 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert") pod "ingress-canary-scc48" (UID: "ffda6afc-6275-4e59-9ea6-12d458a70e5e") : secret "canary-serving-cert" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079194 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4cc87fd9-nwzpg: secret "image-registry-tls" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079193 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079235 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls podName:56f8fa46-17ec-4177-a2ca-f092de6a5dad nodeName:}" failed. No retries permitted until 2026-04-23 09:31:47.079225511 +0000 UTC m=+65.234320457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls") pod "image-registry-6d4cc87fd9-nwzpg" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad") : secret "image-registry-tls" not found Apr 23 09:31:31.079418 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:31:31.079256 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls podName:d38301ed-fb73-4e46-aa00-455f456c3ca3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:47.079242559 +0000 UTC m=+65.234337510 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls") pod "dns-default-pw8pn" (UID: "d38301ed-fb73-4e46-aa00-455f456c3ca3") : secret "dns-default-metrics-tls" not found Apr 23 09:31:32.607912 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:32.607875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2fqtf" event={"ID":"75cd7467-d6ae-4c15-b25f-58105fb85a1b","Type":"ContainerStarted","Data":"272c9c61abaed21faa94ca2dfe21f013c34954936ce8d4f0bd862f45428f6f10"} Apr 23 09:31:32.622186 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:32.622142 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2fqtf" podStartSLOduration=2.252746317 podStartE2EDuration="4.622128789s" podCreationTimestamp="2026-04-23 09:31:28 +0000 UTC" firstStartedPulling="2026-04-23 09:31:29.135943069 +0000 UTC m=+47.291038018" lastFinishedPulling="2026-04-23 09:31:31.505325543 +0000 UTC m=+49.660420490" observedRunningTime="2026-04-23 09:31:32.621685793 +0000 UTC m=+50.776780761" watchObservedRunningTime="2026-04-23 09:31:32.622128789 +0000 UTC m=+50.777223757" Apr 23 09:31:43.292546 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:43.292514 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54r9n" Apr 23 09:31:47.096141 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.096101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:47.096141 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.096147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:47.096548 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.096177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:47.098663 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.098634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffda6afc-6275-4e59-9ea6-12d458a70e5e-cert\") pod \"ingress-canary-scc48\" (UID: \"ffda6afc-6275-4e59-9ea6-12d458a70e5e\") " pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:47.098788 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.098736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod \"image-registry-6d4cc87fd9-nwzpg\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:47.098888 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.098872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38301ed-fb73-4e46-aa00-455f456c3ca3-metrics-tls\") pod \"dns-default-pw8pn\" (UID: \"d38301ed-fb73-4e46-aa00-455f456c3ca3\") " pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:47.350475 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.350395 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-shblk\"" Apr 23 09:31:47.357328 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.357297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:47.361204 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.361189 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\"" Apr 23 09:31:47.366203 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.366174 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\"" Apr 23 09:31:47.368324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.368299 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pw8pn" Apr 23 09:31:47.373993 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.373963 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-scc48" Apr 23 09:31:47.531460 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.531436 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"] Apr 23 09:31:47.532528 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:47.532501 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f8fa46_17ec_4177_a2ca_f092de6a5dad.slice/crio-03341cd0b55d69737a9ae3ff897cb162d140caaed85c5725a31529912e72b712 WatchSource:0}: Error finding container 03341cd0b55d69737a9ae3ff897cb162d140caaed85c5725a31529912e72b712: Status 404 returned error can't find the container with id 03341cd0b55d69737a9ae3ff897cb162d140caaed85c5725a31529912e72b712 Apr 23 09:31:47.641114 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.641038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" event={"ID":"56f8fa46-17ec-4177-a2ca-f092de6a5dad","Type":"ContainerStarted","Data":"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b"} Apr 23 09:31:47.641114 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.641073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" event={"ID":"56f8fa46-17ec-4177-a2ca-f092de6a5dad","Type":"ContainerStarted","Data":"03341cd0b55d69737a9ae3ff897cb162d140caaed85c5725a31529912e72b712"} Apr 23 09:31:47.641329 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.641162 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:31:47.667974 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.667920 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" 
podStartSLOduration=58.66790585 podStartE2EDuration="58.66790585s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:47.66353944 +0000 UTC m=+65.818634421" watchObservedRunningTime="2026-04-23 09:31:47.66790585 +0000 UTC m=+65.823000819" Apr 23 09:31:47.731134 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.731097 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pw8pn"] Apr 23 09:31:47.731805 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.731763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-scc48"] Apr 23 09:31:47.734084 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:47.734060 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd38301ed_fb73_4e46_aa00_455f456c3ca3.slice/crio-3bc4e588c0707eb96d51fe2bac9306c01197f2a19100c4e6f91c54fb1ea56644 WatchSource:0}: Error finding container 3bc4e588c0707eb96d51fe2bac9306c01197f2a19100c4e6f91c54fb1ea56644: Status 404 returned error can't find the container with id 3bc4e588c0707eb96d51fe2bac9306c01197f2a19100c4e6f91c54fb1ea56644 Apr 23 09:31:47.734968 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:47.734935 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffda6afc_6275_4e59_9ea6_12d458a70e5e.slice/crio-13dd01c87a043b713fbdafcfddaf1fc58c7c7dfd398971c6488011537fa86a08 WatchSource:0}: Error finding container 13dd01c87a043b713fbdafcfddaf1fc58c7c7dfd398971c6488011537fa86a08: Status 404 returned error can't find the container with id 13dd01c87a043b713fbdafcfddaf1fc58c7c7dfd398971c6488011537fa86a08 Apr 23 09:31:47.867259 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.867226 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-rfmg9"] Apr 23 09:31:47.870133 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.870111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.873225 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.873201 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 09:31:47.873225 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.873218 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 09:31:47.873427 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.873413 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 09:31:47.873540 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.873524 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 09:31:47.873602 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.873550 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qqv29\"" Apr 23 09:31:47.896146 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.896062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rfmg9"] Apr 23 09:31:47.902221 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.902195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " 
pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.902354 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.902238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-data-volume\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.902354 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.902264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgzdc\" (UniqueName: \"kubernetes.io/projected/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-api-access-bgzdc\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.902442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.902343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-crio-socket\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.902442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.902400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:47.905038 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:47.905017 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"] Apr 23 09:31:48.002948 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.002915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-data-volume\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.002948 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.002950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgzdc\" (UniqueName: \"kubernetes.io/projected/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-api-access-bgzdc\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003188 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.002995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-crio-socket\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003188 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.003054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003188 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.003099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003188 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.003148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-crio-socket\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.003278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-data-volume\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.003560 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.003540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.005470 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.005447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.015155 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:31:48.015133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgzdc\" (UniqueName: \"kubernetes.io/projected/b8da361e-89a3-41c1-9d56-1a4f1d0e6798-kube-api-access-bgzdc\") pod \"insights-runtime-extractor-rfmg9\" (UID: \"b8da361e-89a3-41c1-9d56-1a4f1d0e6798\") " pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.103857 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.103823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:48.106236 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.106214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b48044a-70f3-4dd3-b5c5-d35b911078ea-metrics-certs\") pod \"network-metrics-daemon-zh888\" (UID: \"5b48044a-70f3-4dd3-b5c5-d35b911078ea\") " pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:48.131208 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.131182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\"" Apr 23 09:31:48.139452 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.139429 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh888" Apr 23 09:31:48.179577 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.179549 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rfmg9" Apr 23 09:31:48.274632 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.274599 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zh888"] Apr 23 09:31:48.283737 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:48.283702 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b48044a_70f3_4dd3_b5c5_d35b911078ea.slice/crio-522bc14cb50b9bd24b705da0a1623017400f8569c70ee2321af9db3deb585a96 WatchSource:0}: Error finding container 522bc14cb50b9bd24b705da0a1623017400f8569c70ee2321af9db3deb585a96: Status 404 returned error can't find the container with id 522bc14cb50b9bd24b705da0a1623017400f8569c70ee2321af9db3deb585a96 Apr 23 09:31:48.325058 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.325020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rfmg9"] Apr 23 09:31:48.330364 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:31:48.330322 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8da361e_89a3_41c1_9d56_1a4f1d0e6798.slice/crio-aaf3a7d75b896597d875f7004c1377134c363327d813b4ed28344ad04762e7a6 WatchSource:0}: Error finding container aaf3a7d75b896597d875f7004c1377134c363327d813b4ed28344ad04762e7a6: Status 404 returned error can't find the container with id aaf3a7d75b896597d875f7004c1377134c363327d813b4ed28344ad04762e7a6 Apr 23 09:31:48.647428 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.647378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh888" event={"ID":"5b48044a-70f3-4dd3-b5c5-d35b911078ea","Type":"ContainerStarted","Data":"522bc14cb50b9bd24b705da0a1623017400f8569c70ee2321af9db3deb585a96"} Apr 23 09:31:48.648976 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:31:48.648896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pw8pn" event={"ID":"d38301ed-fb73-4e46-aa00-455f456c3ca3","Type":"ContainerStarted","Data":"3bc4e588c0707eb96d51fe2bac9306c01197f2a19100c4e6f91c54fb1ea56644"}
Apr 23 09:31:48.650601 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.650574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rfmg9" event={"ID":"b8da361e-89a3-41c1-9d56-1a4f1d0e6798","Type":"ContainerStarted","Data":"e6821637ac7eb42b23bddab637940ce5555c266d821472c75054900fcdffb6cb"}
Apr 23 09:31:48.650709 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.650608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rfmg9" event={"ID":"b8da361e-89a3-41c1-9d56-1a4f1d0e6798","Type":"ContainerStarted","Data":"aaf3a7d75b896597d875f7004c1377134c363327d813b4ed28344ad04762e7a6"}
Apr 23 09:31:48.652442 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:48.652413 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-scc48" event={"ID":"ffda6afc-6275-4e59-9ea6-12d458a70e5e","Type":"ContainerStarted","Data":"13dd01c87a043b713fbdafcfddaf1fc58c7c7dfd398971c6488011537fa86a08"}
Apr 23 09:31:50.659246 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.659207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rfmg9" event={"ID":"b8da361e-89a3-41c1-9d56-1a4f1d0e6798","Type":"ContainerStarted","Data":"5b4bd4a1e61296c279f55176c56448c09251cd7c15bc3649001c369aed30a0b5"}
Apr 23 09:31:50.660481 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.660453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-scc48" event={"ID":"ffda6afc-6275-4e59-9ea6-12d458a70e5e","Type":"ContainerStarted","Data":"67f234a7c81899ad0fc95c43473face003d6d6cbc23a69d66e17cf34ee59e84d"}
Apr 23 
09:31:50.663144 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.662806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pw8pn" event={"ID":"d38301ed-fb73-4e46-aa00-455f456c3ca3","Type":"ContainerStarted","Data":"d5be46e08e972ff227ef63cbef3b23eee678697161d32cd611d8bc1b21a12300"}
Apr 23 09:31:50.663144 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.662840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pw8pn" event={"ID":"d38301ed-fb73-4e46-aa00-455f456c3ca3","Type":"ContainerStarted","Data":"2996cf4254a291efe6e89006d2895a29bfd7548706d2bb4c8331bb56a38d708e"}
Apr 23 09:31:50.663304 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.663213 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:31:50.695149 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.695070 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-scc48" podStartSLOduration=33.15631875 podStartE2EDuration="35.695059401s" podCreationTimestamp="2026-04-23 09:31:15 +0000 UTC" firstStartedPulling="2026-04-23 09:31:47.737090363 +0000 UTC m=+65.892185310" lastFinishedPulling="2026-04-23 09:31:50.27583101 +0000 UTC m=+68.430925961" observedRunningTime="2026-04-23 09:31:50.693852338 +0000 UTC m=+68.848947303" watchObservedRunningTime="2026-04-23 09:31:50.695059401 +0000 UTC m=+68.850154361"
Apr 23 09:31:50.725809 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:50.725711 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pw8pn" podStartSLOduration=33.18604242 podStartE2EDuration="35.725698156s" podCreationTimestamp="2026-04-23 09:31:15 +0000 UTC" firstStartedPulling="2026-04-23 09:31:47.736173924 +0000 UTC m=+65.891268870" lastFinishedPulling="2026-04-23 09:31:50.275829658 +0000 UTC m=+68.430924606" observedRunningTime="2026-04-23 
09:31:50.724690902 +0000 UTC m=+68.879785872" watchObservedRunningTime="2026-04-23 09:31:50.725698156 +0000 UTC m=+68.880793125"
Apr 23 09:31:51.667733 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:51.667691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh888" event={"ID":"5b48044a-70f3-4dd3-b5c5-d35b911078ea","Type":"ContainerStarted","Data":"5d213e1ec42592f593dd34f1d96fb27f39f355f1b78d6b9f20b32cb7ac446fb0"}
Apr 23 09:31:52.672063 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:52.672025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rfmg9" event={"ID":"b8da361e-89a3-41c1-9d56-1a4f1d0e6798","Type":"ContainerStarted","Data":"7e11cd200689b724063761209b861fea1f3aef9b1540680334f944992ae404de"}
Apr 23 09:31:52.673490 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:52.673461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh888" event={"ID":"5b48044a-70f3-4dd3-b5c5-d35b911078ea","Type":"ContainerStarted","Data":"50a1383b7219f0c2b35b0de25521d0155eed9f1d80c2d7e877ed410d3c2cdbd8"}
Apr 23 09:31:52.690336 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:52.690288 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rfmg9" podStartSLOduration=2.141286454 podStartE2EDuration="5.690275098s" podCreationTimestamp="2026-04-23 09:31:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:48.398472759 +0000 UTC m=+66.553567708" lastFinishedPulling="2026-04-23 09:31:51.94746139 +0000 UTC m=+70.102556352" observedRunningTime="2026-04-23 09:31:52.689169805 +0000 UTC m=+70.844264773" watchObservedRunningTime="2026-04-23 09:31:52.690275098 +0000 UTC m=+70.845370061"
Apr 23 09:31:52.704265 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:52.704215 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-zh888" podStartSLOduration=67.519566174 podStartE2EDuration="1m10.704200464s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:31:48.286190932 +0000 UTC m=+66.441285882" lastFinishedPulling="2026-04-23 09:31:51.470825209 +0000 UTC m=+69.625920172" observedRunningTime="2026-04-23 09:31:52.702687538 +0000 UTC m=+70.857782506" watchObservedRunningTime="2026-04-23 09:31:52.704200464 +0000 UTC m=+70.859295435"
Apr 23 09:31:53.591009 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:31:53.590967 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jrn69"
Apr 23 09:32:00.669835 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:00.669705 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pw8pn"
Apr 23 09:32:09.660795 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:09.660757 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"
Apr 23 09:32:12.199905 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.199876 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-847d9cd844-8h57x"]
Apr 23 09:32:12.203829 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.203811 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.206227 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.206202 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 09:32:12.207105 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 09:32:12.207219 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207163 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 09:32:12.207280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207228 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 09:32:12.207280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 09:32:12.207280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207232 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 09:32:12.207280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207250 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xfsvc\""
Apr 23 09:32:12.207280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.207088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 09:32:12.214317 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.214246 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847d9cd844-8h57x"]
Apr 23 09:32:12.251056 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:32:12.251027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.251194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.251064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.251194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.251080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.251194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.251098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74gs\" (UniqueName: \"kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.251194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.251137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config\") pod \"console-847d9cd844-8h57x\" (UID: 
\"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.251194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.251177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.351520 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.351699 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.351699 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.351699 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c74gs\" (UniqueName: 
\"kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.351699 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.352018 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.351720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.352269 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.352245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.352950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.352931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.353013 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.352993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.354090 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.354072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.354228 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.354210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.359213 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.359182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74gs\" (UniqueName: \"kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs\") pod \"console-847d9cd844-8h57x\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") " pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.513088 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.513052 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:12.635986 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.635950 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847d9cd844-8h57x"]
Apr 23 09:32:12.638800 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:12.638753 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914265a0_79d7_4eb7_9d72_6b1a419f229e.slice/crio-e824bf56d6f5effad27660950ed4b2f71f18102dbe8e8f1784b308f86f6fce60 WatchSource:0}: Error finding container e824bf56d6f5effad27660950ed4b2f71f18102dbe8e8f1784b308f86f6fce60: Status 404 returned error can't find the container with id e824bf56d6f5effad27660950ed4b2f71f18102dbe8e8f1784b308f86f6fce60
Apr 23 09:32:12.726895 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:12.726860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d9cd844-8h57x" event={"ID":"914265a0-79d7-4eb7-9d72-6b1a419f229e","Type":"ContainerStarted","Data":"e824bf56d6f5effad27660950ed4b2f71f18102dbe8e8f1784b308f86f6fce60"}
Apr 23 09:32:13.540258 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.540225 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:32:13.541849 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.541821 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.549590 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.549532 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 09:32:13.553980 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.553955 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:32:13.660504 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660504 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660825 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660825 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660825 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchqm\" (UniqueName: \"kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660825 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.660968 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.660856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.761737 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.761696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hchqm\" (UniqueName: \"kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.761935 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:32:13.761747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.761935 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.761797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.762158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.762368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.762562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 
09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.763232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.763272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.763318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.764137 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.763531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.767626 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.767605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " 
pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.767877 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.767855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.768234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.768211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.777935 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.777916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchqm\" (UniqueName: \"kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm\") pod \"console-7ccbc4855d-7lnmg\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.854939 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.854860 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:32:13.992387 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:13.992359 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:32:13.995413 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:13.995376 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27384921_96c4_40b0_9acc_195969899794.slice/crio-bb8e40fcef1e10dc3377f9440f1fccd521205b74c202323852efa8c826fab94f WatchSource:0}: Error finding container bb8e40fcef1e10dc3377f9440f1fccd521205b74c202323852efa8c826fab94f: Status 404 returned error can't find the container with id bb8e40fcef1e10dc3377f9440f1fccd521205b74c202323852efa8c826fab94f
Apr 23 09:32:14.089171 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.089139 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"]
Apr 23 09:32:14.092063 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.092046 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"
Apr 23 09:32:14.094397 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.094373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-28hbz\""
Apr 23 09:32:14.094492 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.094411 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 09:32:14.099500 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.099469 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"]
Apr 23 09:32:14.167319 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.167236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wvdnl\" (UID: \"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"
Apr 23 09:32:14.267646 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.267606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wvdnl\" (UID: \"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"
Apr 23 09:32:14.267860 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:14.267807 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret 
"prometheus-operator-admission-webhook-tls" not found
Apr 23 09:32:14.267923 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:14.267893 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates podName:f5d15c32-85eb-45ba-8ab7-d34f9c152d7c nodeName:}" failed. No retries permitted until 2026-04-23 09:32:14.767870642 +0000 UTC m=+92.922965595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-wvdnl" (UID: "f5d15c32-85eb-45ba-8ab7-d34f9c152d7c") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 23 09:32:14.676759 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.676708 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" podUID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" containerName="registry" containerID="cri-o://a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b" gracePeriod=30
Apr 23 09:32:14.735637 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.735590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ccbc4855d-7lnmg" event={"ID":"27384921-96c4-40b0-9acc-195969899794","Type":"ContainerStarted","Data":"bb8e40fcef1e10dc3377f9440f1fccd521205b74c202323852efa8c826fab94f"}
Apr 23 09:32:14.773295 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.773258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wvdnl\" (UID: \"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"
Apr 23 
09:32:14.776311 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.776283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5d15c32-85eb-45ba-8ab7-d34f9c152d7c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wvdnl\" (UID: \"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" Apr 23 09:32:14.959178 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:14.959156 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:32:15.002207 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.001762 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" Apr 23 09:32:15.075922 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.075889 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwkk4\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.075966 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076010 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076038 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076065 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076093 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076326 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076143 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca\") pod \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076326 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076172 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") pod 
\"56f8fa46-17ec-4177-a2ca-f092de6a5dad\" (UID: \"56f8fa46-17ec-4177-a2ca-f092de6a5dad\") " Apr 23 09:32:15.076855 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076681 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:15.076855 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.076811 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:15.079562 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.079528 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:15.079799 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.079704 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:15.079889 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.079854 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:15.081020 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.080994 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:15.083232 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.083199 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4" (OuterVolumeSpecName: "kube-api-access-xwkk4") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "kube-api-access-xwkk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:15.087568 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.087541 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "56f8fa46-17ec-4177-a2ca-f092de6a5dad" (UID: "56f8fa46-17ec-4177-a2ca-f092de6a5dad"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:32:15.177509 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177449 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-image-registry-private-configuration\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177509 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177511 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-bound-sa-token\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177547 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/56f8fa46-17ec-4177-a2ca-f092de6a5dad-ca-trust-extracted\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177563 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/56f8fa46-17ec-4177-a2ca-f092de6a5dad-installation-pull-secrets\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177577 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-certificates\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177591 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56f8fa46-17ec-4177-a2ca-f092de6a5dad-trusted-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath 
\"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177622 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-registry-tls\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.177738 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.177636 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwkk4\" (UniqueName: \"kubernetes.io/projected/56f8fa46-17ec-4177-a2ca-f092de6a5dad-kube-api-access-xwkk4\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:15.739985 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.739946 2577 generic.go:358] "Generic (PLEG): container finished" podID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" containerID="a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b" exitCode=0 Apr 23 09:32:15.740417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.740018 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" Apr 23 09:32:15.740417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.740051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" event={"ID":"56f8fa46-17ec-4177-a2ca-f092de6a5dad","Type":"ContainerDied","Data":"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b"} Apr 23 09:32:15.740417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.740081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4cc87fd9-nwzpg" event={"ID":"56f8fa46-17ec-4177-a2ca-f092de6a5dad","Type":"ContainerDied","Data":"03341cd0b55d69737a9ae3ff897cb162d140caaed85c5725a31529912e72b712"} Apr 23 09:32:15.740417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.740101 2577 scope.go:117] "RemoveContainer" containerID="a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b" Apr 23 09:32:15.767184 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.767152 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"] Apr 23 09:32:15.771061 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:15.771036 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d4cc87fd9-nwzpg"] Apr 23 09:32:16.122802 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.122682 2577 scope.go:117] "RemoveContainer" containerID="a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b" Apr 23 09:32:16.123144 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:16.123114 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b\": container with ID starting with a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b not found: ID does not exist" 
containerID="a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b" Apr 23 09:32:16.123199 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.123161 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b"} err="failed to get container status \"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b\": rpc error: code = NotFound desc = could not find container \"a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b\": container with ID starting with a0e875b1ada7f5c65b6ca22aacd1c7293232353127db73316ccfe4fd8606751b not found: ID does not exist" Apr 23 09:32:16.249139 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.249112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl"] Apr 23 09:32:16.252143 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:16.252110 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d15c32_85eb_45ba_8ab7_d34f9c152d7c.slice/crio-da55dc34db90bf67ab421a803cdbe1aa4b6f0e6ae4e162f9079ef9b9dc89d7bc WatchSource:0}: Error finding container da55dc34db90bf67ab421a803cdbe1aa4b6f0e6ae4e162f9079ef9b9dc89d7bc: Status 404 returned error can't find the container with id da55dc34db90bf67ab421a803cdbe1aa4b6f0e6ae4e162f9079ef9b9dc89d7bc Apr 23 09:32:16.408130 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.408050 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" path="/var/lib/kubelet/pods/56f8fa46-17ec-4177-a2ca-f092de6a5dad/volumes" Apr 23 09:32:16.746787 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.746739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ccbc4855d-7lnmg" 
event={"ID":"27384921-96c4-40b0-9acc-195969899794","Type":"ContainerStarted","Data":"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"} Apr 23 09:32:16.747747 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.747728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" event={"ID":"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c","Type":"ContainerStarted","Data":"da55dc34db90bf67ab421a803cdbe1aa4b6f0e6ae4e162f9079ef9b9dc89d7bc"} Apr 23 09:32:16.749417 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.749394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d9cd844-8h57x" event={"ID":"914265a0-79d7-4eb7-9d72-6b1a419f229e","Type":"ContainerStarted","Data":"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f"} Apr 23 09:32:16.765851 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.765814 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7ccbc4855d-7lnmg" podStartSLOduration=1.599287467 podStartE2EDuration="3.765804002s" podCreationTimestamp="2026-04-23 09:32:13 +0000 UTC" firstStartedPulling="2026-04-23 09:32:13.997612836 +0000 UTC m=+92.152707783" lastFinishedPulling="2026-04-23 09:32:16.164129356 +0000 UTC m=+94.319224318" observedRunningTime="2026-04-23 09:32:16.764760401 +0000 UTC m=+94.919855403" watchObservedRunningTime="2026-04-23 09:32:16.765804002 +0000 UTC m=+94.920898967" Apr 23 09:32:16.823131 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:16.823081 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-847d9cd844-8h57x" podStartSLOduration=1.2876679979999999 podStartE2EDuration="4.823068946s" podCreationTimestamp="2026-04-23 09:32:12 +0000 UTC" firstStartedPulling="2026-04-23 09:32:12.640603693 +0000 UTC m=+90.795698640" lastFinishedPulling="2026-04-23 09:32:16.176004634 +0000 UTC m=+94.331099588" 
observedRunningTime="2026-04-23 09:32:16.821897801 +0000 UTC m=+94.976992768" watchObservedRunningTime="2026-04-23 09:32:16.823068946 +0000 UTC m=+94.978163915" Apr 23 09:32:17.753704 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:17.753670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" event={"ID":"f5d15c32-85eb-45ba-8ab7-d34f9c152d7c","Type":"ContainerStarted","Data":"6795ca7f0ec3422bfa67882fda0b2a713de4db990e3e877a3f8bcd07e5c6f712"} Apr 23 09:32:17.769268 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:17.769222 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" podStartSLOduration=2.334174728 podStartE2EDuration="3.769209091s" podCreationTimestamp="2026-04-23 09:32:14 +0000 UTC" firstStartedPulling="2026-04-23 09:32:16.25393759 +0000 UTC m=+94.409032541" lastFinishedPulling="2026-04-23 09:32:17.688971958 +0000 UTC m=+95.844066904" observedRunningTime="2026-04-23 09:32:17.768979464 +0000 UTC m=+95.924074433" watchObservedRunningTime="2026-04-23 09:32:17.769209091 +0000 UTC m=+95.924304059" Apr 23 09:32:18.756840 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:18.756809 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" Apr 23 09:32:18.761423 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:18.761400 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wvdnl" Apr 23 09:32:19.180968 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.180888 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gcq96"] Apr 23 09:32:19.181145 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.181133 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" containerName="registry" Apr 23 09:32:19.181210 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.181147 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" containerName="registry" Apr 23 09:32:19.181210 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.181189 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f8fa46-17ec-4177-a2ca-f092de6a5dad" containerName="registry" Apr 23 09:32:19.182880 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.182863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.186068 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186033 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 09:32:19.186068 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186057 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 09:32:19.186248 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186074 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 09:32:19.186248 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186117 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-g7r9c\"" Apr 23 09:32:19.186248 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186057 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 09:32:19.186389 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.186315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 09:32:19.191093 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.191073 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gcq96"] Apr 23 09:32:19.307871 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.307834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60052d67-8854-44ac-9a6b-d9c4acc92c52-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.308028 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.307922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.308028 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.307946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv76p\" (UniqueName: \"kubernetes.io/projected/60052d67-8854-44ac-9a6b-d9c4acc92c52-kube-api-access-wv76p\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.308028 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.308008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.408455 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.408427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.408601 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.408476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60052d67-8854-44ac-9a6b-d9c4acc92c52-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.408601 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:19.408561 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 09:32:19.408703 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:19.408616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls podName:60052d67-8854-44ac-9a6b-d9c4acc92c52 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:19.908601619 +0000 UTC m=+98.063696566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gcq96" (UID: "60052d67-8854-44ac-9a6b-d9c4acc92c52") : secret "prometheus-operator-tls" not found Apr 23 09:32:19.408760 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.408733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.408834 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.408802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv76p\" (UniqueName: \"kubernetes.io/projected/60052d67-8854-44ac-9a6b-d9c4acc92c52-kube-api-access-wv76p\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.409097 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.409078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60052d67-8854-44ac-9a6b-d9c4acc92c52-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.411226 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.411208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.417187 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.417165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv76p\" (UniqueName: \"kubernetes.io/projected/60052d67-8854-44ac-9a6b-d9c4acc92c52-kube-api-access-wv76p\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.914557 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:19.914515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:19.915075 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:19.914648 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 09:32:19.915075 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:19.914725 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls podName:60052d67-8854-44ac-9a6b-d9c4acc92c52 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:20.914702999 +0000 UTC m=+99.069797946 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gcq96" (UID: "60052d67-8854-44ac-9a6b-d9c4acc92c52") : secret "prometheus-operator-tls" not found Apr 23 09:32:20.921750 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:20.921709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:20.924277 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:20.924256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60052d67-8854-44ac-9a6b-d9c4acc92c52-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gcq96\" (UID: \"60052d67-8854-44ac-9a6b-d9c4acc92c52\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:20.992313 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:20.992270 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" Apr 23 09:32:21.106148 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:21.106120 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gcq96"] Apr 23 09:32:21.108483 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:21.108443 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60052d67_8854_44ac_9a6b_d9c4acc92c52.slice/crio-6865f70837080945f605d37a513aa32b9f4bf230bfe8867748ef3ae73c57357a WatchSource:0}: Error finding container 6865f70837080945f605d37a513aa32b9f4bf230bfe8867748ef3ae73c57357a: Status 404 returned error can't find the container with id 6865f70837080945f605d37a513aa32b9f4bf230bfe8867748ef3ae73c57357a Apr 23 09:32:21.766358 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:21.766313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" event={"ID":"60052d67-8854-44ac-9a6b-d9c4acc92c52","Type":"ContainerStarted","Data":"6865f70837080945f605d37a513aa32b9f4bf230bfe8867748ef3ae73c57357a"} Apr 23 09:32:22.514888 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.514143 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-847d9cd844-8h57x" Apr 23 09:32:22.514888 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.514197 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-847d9cd844-8h57x" Apr 23 09:32:22.520153 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.520117 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-847d9cd844-8h57x" Apr 23 09:32:22.771008 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.770971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" event={"ID":"60052d67-8854-44ac-9a6b-d9c4acc92c52","Type":"ContainerStarted","Data":"3cb73828e6c49b2098c6be505030f998f599b54c21290773f2aaf4307ea19cc9"} Apr 23 09:32:22.771008 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.771017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" event={"ID":"60052d67-8854-44ac-9a6b-d9c4acc92c52","Type":"ContainerStarted","Data":"8497b255bc9ff2127be7a4e49c1c688424c5855b0b522ce0b02391dcf307afcb"} Apr 23 09:32:22.775166 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.775140 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-847d9cd844-8h57x" Apr 23 09:32:22.790481 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:22.790426 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gcq96" podStartSLOduration=2.572885962 podStartE2EDuration="3.790413017s" podCreationTimestamp="2026-04-23 09:32:19 +0000 UTC" firstStartedPulling="2026-04-23 09:32:21.110265061 +0000 UTC m=+99.265360008" lastFinishedPulling="2026-04-23 09:32:22.327792115 +0000 UTC m=+100.482887063" observedRunningTime="2026-04-23 09:32:22.789143527 +0000 UTC m=+100.944238487" watchObservedRunningTime="2026-04-23 09:32:22.790413017 +0000 UTC m=+100.945507985" Apr 23 09:32:23.855524 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:23.855490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7ccbc4855d-7lnmg" Apr 23 09:32:23.855524 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:23.855524 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7ccbc4855d-7lnmg" Apr 23 09:32:23.860211 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:23.860189 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-7ccbc4855d-7lnmg" Apr 23 09:32:24.551041 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.551007 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-88zx9"] Apr 23 09:32:24.553034 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.553002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.555265 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.555242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 09:32:24.555379 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.555344 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cdnqx\"" Apr 23 09:32:24.555379 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.555364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 09:32:24.555678 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.555661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 09:32:24.648658 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-root\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.648854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.648854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-node-exporter-wtmp\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.648854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-node-exporter-accelerators-collector-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.648854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-sys\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.649083 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/31d81257-086f-4891-af00-db82e4feba37-node-exporter-textfile\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.649083 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:32:24.648905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-metrics-client-ca\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.649083 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.648972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxlg\" (UniqueName: \"kubernetes.io/projected/31d81257-086f-4891-af00-db82e4feba37-kube-api-access-bxxlg\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.649083 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.649010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.749866 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-node-exporter-accelerators-collector-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.749866 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-sys\") pod 
\"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/31d81257-086f-4891-af00-db82e4feba37-node-exporter-textfile\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-metrics-client-ca\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxlg\" (UniqueName: \"kubernetes.io/projected/31d81257-086f-4891-af00-db82e4feba37-kube-api-access-bxxlg\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.749998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-root\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750112 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750395 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-sys\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750395 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-node-exporter-wtmp\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750395 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-node-exporter-wtmp\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750540 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-node-exporter-accelerators-collector-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750669 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/31d81257-086f-4891-af00-db82e4feba37-root\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.750764 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:24.750747 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 09:32:24.750940 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:24.750928 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls podName:31d81257-086f-4891-af00-db82e4feba37 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:25.250906451 +0000 UTC m=+103.406001420 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls") pod "node-exporter-88zx9" (UID: "31d81257-086f-4891-af00-db82e4feba37") : secret "node-exporter-tls" not found Apr 23 09:32:24.751034 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/31d81257-086f-4891-af00-db82e4feba37-node-exporter-textfile\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.751034 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.750966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31d81257-086f-4891-af00-db82e4feba37-metrics-client-ca\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.753138 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.753114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.763634 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.763604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxlg\" (UniqueName: \"kubernetes.io/projected/31d81257-086f-4891-af00-db82e4feba37-kube-api-access-bxxlg\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:24.784238 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.784065 
2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7ccbc4855d-7lnmg" Apr 23 09:32:24.836031 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:24.835950 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-847d9cd844-8h57x"] Apr 23 09:32:25.254665 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.254630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:25.257326 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.257298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/31d81257-086f-4891-af00-db82e4feba37-node-exporter-tls\") pod \"node-exporter-88zx9\" (UID: \"31d81257-086f-4891-af00-db82e4feba37\") " pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:25.462675 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.462637 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-88zx9" Apr 23 09:32:25.470557 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:25.470530 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d81257_086f_4891_af00_db82e4feba37.slice/crio-09112c33894bb26e912773e484962a1086cfcced5592bf18e522525c21916c50 WatchSource:0}: Error finding container 09112c33894bb26e912773e484962a1086cfcced5592bf18e522525c21916c50: Status 404 returned error can't find the container with id 09112c33894bb26e912773e484962a1086cfcced5592bf18e522525c21916c50 Apr 23 09:32:25.684656 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.684560 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 09:32:25.687702 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.687677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.690170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 09:32:25.690170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 09:32:25.690170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690032 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 09:32:25.690170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690036 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 09:32:25.690170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690066 2577 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 09:32:25.690550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 09:32:25.690550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690411 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 09:32:25.690550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5j6m5\"" Apr 23 09:32:25.690550 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.690427 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 09:32:25.698717 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.698696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 09:32:25.702854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.702833 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 09:32:25.759426 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759606 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759431 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-web-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759606 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759606 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759606 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759606 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-out\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 
ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llv24\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-kube-api-access-llv24\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.759845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.759839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.785656 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.785615 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88zx9" event={"ID":"31d81257-086f-4891-af00-db82e4feba37","Type":"ContainerStarted","Data":"09112c33894bb26e912773e484962a1086cfcced5592bf18e522525c21916c50"} Apr 23 09:32:25.860666 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.860666 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860669 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-web-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.860931 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.860931 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.860931 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.860931 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.860831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-out\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.862149 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.861725 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.862149 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.861812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.862149 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llv24\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-kube-api-access-llv24\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.862575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.863436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.863541 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.863497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1f9deca1-8e6e-4a72-8f05-18ee01034792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.866027 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.866000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.866335 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.866306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.867195 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.866810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-out\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.867195 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.867161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.867711 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.867662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.867711 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.867688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-web-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.868054 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.868015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.868735 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.868693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f9deca1-8e6e-4a72-8f05-18ee01034792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.869463 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.869334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.875338 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.874868 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llv24\" (UniqueName: \"kubernetes.io/projected/1f9deca1-8e6e-4a72-8f05-18ee01034792-kube-api-access-llv24\") pod \"alertmanager-main-0\" (UID: \"1f9deca1-8e6e-4a72-8f05-18ee01034792\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:25.999702 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:25.999674 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 09:32:26.237236 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:26.237084 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 09:32:26.239370 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:26.239343 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9deca1_8e6e_4a72_8f05_18ee01034792.slice/crio-e5ef03a47eb20758f2eab07fa0873965c04e42b8084f80295f01d1a19bf228c1 WatchSource:0}: Error finding container e5ef03a47eb20758f2eab07fa0873965c04e42b8084f80295f01d1a19bf228c1: Status 404 returned error can't find the container with id e5ef03a47eb20758f2eab07fa0873965c04e42b8084f80295f01d1a19bf228c1 Apr 23 09:32:26.790065 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:26.790024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"e5ef03a47eb20758f2eab07fa0873965c04e42b8084f80295f01d1a19bf228c1"} Apr 23 09:32:26.791721 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:26.791690 2577 generic.go:358] "Generic (PLEG): container finished" podID="31d81257-086f-4891-af00-db82e4feba37" containerID="ee186401fec14a2712c20542ce4100b7e9dcf7a4e6ce83cc3c7deec18049bd43" exitCode=0 Apr 23 09:32:26.791868 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:26.791744 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/node-exporter-88zx9" event={"ID":"31d81257-086f-4891-af00-db82e4feba37","Type":"ContainerDied","Data":"ee186401fec14a2712c20542ce4100b7e9dcf7a4e6ce83cc3c7deec18049bd43"} Apr 23 09:32:27.796854 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:27.796818 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f9deca1-8e6e-4a72-8f05-18ee01034792" containerID="179621f0f6fc35d2b8f2f9fea48e8d804eb6957115873d0a897586ecbba10c70" exitCode=0 Apr 23 09:32:27.797318 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:27.796921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerDied","Data":"179621f0f6fc35d2b8f2f9fea48e8d804eb6957115873d0a897586ecbba10c70"} Apr 23 09:32:27.798940 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:27.798921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88zx9" event={"ID":"31d81257-086f-4891-af00-db82e4feba37","Type":"ContainerStarted","Data":"87ffe942fff2c7daca22cf4d49d78b8133d659b0eb9496e47fd3bdb1ee8d29fa"} Apr 23 09:32:27.798940 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:27.798949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88zx9" event={"ID":"31d81257-086f-4891-af00-db82e4feba37","Type":"ContainerStarted","Data":"992062c2f9e41f337e3a6f2c78681238dcc47ad331493d1c7b49e34f97137e11"} Apr 23 09:32:27.843198 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:27.843132 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-88zx9" podStartSLOduration=3.18565488 podStartE2EDuration="3.843115428s" podCreationTimestamp="2026-04-23 09:32:24 +0000 UTC" firstStartedPulling="2026-04-23 09:32:25.472390017 +0000 UTC m=+103.627484964" lastFinishedPulling="2026-04-23 09:32:26.129850565 +0000 UTC m=+104.284945512" 
observedRunningTime="2026-04-23 09:32:27.841565545 +0000 UTC m=+105.996660537" watchObservedRunningTime="2026-04-23 09:32:27.843115428 +0000 UTC m=+105.998210397" Apr 23 09:32:28.864861 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.864824 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55c9c986d8-bsgq5"] Apr 23 09:32:28.869194 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.869162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.872156 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.872129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 09:32:28.872287 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.872196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 09:32:28.873302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.873104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 09:32:28.873302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.873156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-v3aphjnknj4h\"" Apr 23 09:32:28.873302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.873203 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 09:32:28.873302 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.873104 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qdpvl\"" Apr 23 09:32:28.880955 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.880932 2577 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c9c986d8-bsgq5"] Apr 23 09:32:28.992132 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992132 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-metrics-server-audit-profiles\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992384 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcknk\" (UniqueName: \"kubernetes.io/projected/3dd24c38-657c-425c-a27a-09e527b26e51-kube-api-access-xcknk\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992384 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3dd24c38-657c-425c-a27a-09e527b26e51-audit-log\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992384 ip-10-0-140-162 kubenswrapper[2577]: 
I0423 09:32:28.992266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-tls\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992384 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-client-certs\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:28.992384 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:28.992373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-client-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093131 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" 
(UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-metrics-server-audit-profiles\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcknk\" (UniqueName: \"kubernetes.io/projected/3dd24c38-657c-425c-a27a-09e527b26e51-kube-api-access-xcknk\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3dd24c38-657c-425c-a27a-09e527b26e51-audit-log\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-tls\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093404 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-client-certs\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " 
pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093404 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-client-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093648 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3dd24c38-657c-425c-a27a-09e527b26e51-audit-log\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.093928 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.093902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.094193 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.094098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3dd24c38-657c-425c-a27a-09e527b26e51-metrics-server-audit-profiles\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.095547 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.095520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-client-ca-bundle\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.095705 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.095685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-tls\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.095761 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.095703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd24c38-657c-425c-a27a-09e527b26e51-secret-metrics-server-client-certs\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.101525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.101501 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcknk\" (UniqueName: \"kubernetes.io/projected/3dd24c38-657c-425c-a27a-09e527b26e51-kube-api-access-xcknk\") pod \"metrics-server-55c9c986d8-bsgq5\" (UID: \"3dd24c38-657c-425c-a27a-09e527b26e51\") " pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.181635 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.181609 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:32:29.322153 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.322126 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c9c986d8-bsgq5"] Apr 23 09:32:29.325473 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:29.325443 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd24c38_657c_425c_a27a_09e527b26e51.slice/crio-3e102b145db38e2229de5c3136a4f029becf39dae6b1679c1d16a9029a0261d5 WatchSource:0}: Error finding container 3e102b145db38e2229de5c3136a4f029becf39dae6b1679c1d16a9029a0261d5: Status 404 returned error can't find the container with id 3e102b145db38e2229de5c3136a4f029becf39dae6b1679c1d16a9029a0261d5 Apr 23 09:32:29.336079 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.335930 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"] Apr 23 09:32:29.338378 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.338023 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.347834 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.347759 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"] Apr 23 09:32:29.496993 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.496957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.497169 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.496997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.497169 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.497105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.497169 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.497135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 
09:32:29.497288 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.497179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.497288 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.497215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8mmd\" (UniqueName: \"kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.497353 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.497303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.597891 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.597850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.597891 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.597894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config\") pod 
\"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598080 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.597912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598216 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598291 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8mmd\" (UniqueName: \"kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598363 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598412 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598656 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.598955 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.599026 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.598956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.599026 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.599018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.601005 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.600986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.601048 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.600986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.606520 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.606504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8mmd\" (UniqueName: \"kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd\") pod \"console-75bb568db7-wtvw8\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.652603 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.652528 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:29.759552 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.759496 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-695f97b69f-4zff8"] Apr 23 09:32:29.762856 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.762840 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.767100 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.767062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 09:32:29.767331 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.767316 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 09:32:29.767436 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.767418 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 09:32:29.767510 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.767494 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 09:32:29.767662 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.767647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 09:32:29.770520 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.770503 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-krwrs\""
Apr 23 09:32:29.775035 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.775017 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 09:32:29.793686 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.793661 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-695f97b69f-4zff8"]
Apr 23 09:32:29.805964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.804064 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"]
Apr 23 09:32:29.807955 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.807932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" event={"ID":"3dd24c38-657c-425c-a27a-09e527b26e51","Type":"ContainerStarted","Data":"3e102b145db38e2229de5c3136a4f029becf39dae6b1679c1d16a9029a0261d5"}
Apr 23 09:32:29.809269 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:29.809246 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfca5a52_7560_4f74_88c0_1ea067add34f.slice/crio-76566ce10491cd74aaab5eb70e81d0aea49dc5e94986df535342479dea429e51 WatchSource:0}: Error finding container 76566ce10491cd74aaab5eb70e81d0aea49dc5e94986df535342479dea429e51: Status 404 returned error can't find the container with id 76566ce10491cd74aaab5eb70e81d0aea49dc5e94986df535342479dea429e51
Apr 23 09:32:29.810598 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.810576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"9e87c7c46181972e55576416ed2acd854e42f662fb89a55ae8fd22e0bc06f02f"}
Apr 23 09:32:29.810674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.810602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"8952f39af411d20d07a459577073fe63ef0b3e1ab4ed291cb73a507bc223833a"}
Apr 23 09:32:29.810674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.810614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"61dd5d695ed1e1d2f05aeded748739e5e2e32694c835d24776e6bde26c05910a"}
Apr 23 09:32:29.810674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.810623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"8b05fa41405b02f8bdb5bdcb2a565b3c347b1a093d6faf56632700afa53888e0"}
Apr 23 09:32:29.810674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.810630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"07f988d8279dfe732e27e3fb9ebe67f1e54602e78d93711407f27150d76b9e5e"}
Apr 23 09:32:29.901535 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-metrics-client-ca\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-federate-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-serving-certs-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:29.901964 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:29.901830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dh9m\" (UniqueName: \"kubernetes.io/projected/0a099632-59e6-471e-9564-387d2c572c9f-kube-api-access-9dh9m\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.002921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.002976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dh9m\" (UniqueName: \"kubernetes.io/projected/0a099632-59e6-471e-9564-387d2c572c9f-kube-api-access-9dh9m\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-metrics-client-ca\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-federate-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-serving-certs-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.003909 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.003853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-metrics-client-ca\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.004093 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.004062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.004293 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.004243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a099632-59e6-471e-9564-387d2c572c9f-serving-certs-ca-bundle\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.006181 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.006141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.006575 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.006556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-telemeter-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.006653 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.006616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-federate-client-tls\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.006736 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.006711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a099632-59e6-471e-9564-387d2c572c9f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.011693 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.011675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dh9m\" (UniqueName: \"kubernetes.io/projected/0a099632-59e6-471e-9564-387d2c572c9f-kube-api-access-9dh9m\") pod \"telemeter-client-695f97b69f-4zff8\" (UID: \"0a099632-59e6-471e-9564-387d2c572c9f\") " pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.072744 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.072707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8"
Apr 23 09:32:30.249184 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.249138 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-695f97b69f-4zff8"]
Apr 23 09:32:30.253442 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:30.253372 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a099632_59e6_471e_9564_387d2c572c9f.slice/crio-6056e7a5692a2b9b38b6a7379fd19e63c513705e43ee3ffb08c39308dcf3387c WatchSource:0}: Error finding container 6056e7a5692a2b9b38b6a7379fd19e63c513705e43ee3ffb08c39308dcf3387c: Status 404 returned error can't find the container with id 6056e7a5692a2b9b38b6a7379fd19e63c513705e43ee3ffb08c39308dcf3387c
Apr 23 09:32:30.820285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.820071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f9deca1-8e6e-4a72-8f05-18ee01034792","Type":"ContainerStarted","Data":"9b5462ee437be17d73534434c0261c5d16d9023b1d22906e46819ca3e21e25dc"}
Apr 23 09:32:30.824406 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.824346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bb568db7-wtvw8" event={"ID":"dfca5a52-7560-4f74-88c0-1ea067add34f","Type":"ContainerStarted","Data":"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1"}
Apr 23 09:32:30.824406 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.824382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bb568db7-wtvw8" event={"ID":"dfca5a52-7560-4f74-88c0-1ea067add34f","Type":"ContainerStarted","Data":"76566ce10491cd74aaab5eb70e81d0aea49dc5e94986df535342479dea429e51"}
Apr 23 09:32:30.826284 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.826253 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8" event={"ID":"0a099632-59e6-471e-9564-387d2c572c9f","Type":"ContainerStarted","Data":"6056e7a5692a2b9b38b6a7379fd19e63c513705e43ee3ffb08c39308dcf3387c"}
Apr 23 09:32:30.847515 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.847472 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.9546986400000002 podStartE2EDuration="5.847458425s" podCreationTimestamp="2026-04-23 09:32:25 +0000 UTC" firstStartedPulling="2026-04-23 09:32:26.241231352 +0000 UTC m=+104.396326299" lastFinishedPulling="2026-04-23 09:32:30.133991133 +0000 UTC m=+108.289086084" observedRunningTime="2026-04-23 09:32:30.846252949 +0000 UTC m=+109.001347928" watchObservedRunningTime="2026-04-23 09:32:30.847458425 +0000 UTC m=+109.002553394"
Apr 23 09:32:30.867728 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:30.867674 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75bb568db7-wtvw8" podStartSLOduration=1.8676533910000002 podStartE2EDuration="1.867653391s" podCreationTimestamp="2026-04-23 09:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:32:30.866233235 +0000 UTC m=+109.021328240" watchObservedRunningTime="2026-04-23 09:32:30.867653391 +0000 UTC m=+109.022748361"
Apr 23 09:32:31.831526 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:31.831483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" event={"ID":"3dd24c38-657c-425c-a27a-09e527b26e51","Type":"ContainerStarted","Data":"5f197274a7d4991d42ac661c62ee01bbc2a58d818e90b15cb49db28a006949cd"}
Apr 23 09:32:31.848836 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:31.848758 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" podStartSLOduration=2.388918258 podStartE2EDuration="3.848743968s" podCreationTimestamp="2026-04-23 09:32:28 +0000 UTC" firstStartedPulling="2026-04-23 09:32:29.327532752 +0000 UTC m=+107.482627699" lastFinishedPulling="2026-04-23 09:32:30.787358451 +0000 UTC m=+108.942453409" observedRunningTime="2026-04-23 09:32:31.847648501 +0000 UTC m=+110.002743469" watchObservedRunningTime="2026-04-23 09:32:31.848743968 +0000 UTC m=+110.003838936"
Apr 23 09:32:32.835531 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:32.835490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8" event={"ID":"0a099632-59e6-471e-9564-387d2c572c9f","Type":"ContainerStarted","Data":"7fee946457d948da1a147e30f601bf7a00583ddd9f02dae382c5d0764f326795"}
Apr 23 09:32:32.835935 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:32.835539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8" event={"ID":"0a099632-59e6-471e-9564-387d2c572c9f","Type":"ContainerStarted","Data":"776d703eca9e5d6dbc3259f7cf090e26ad3677dd5f04cff27d9da26948f3cb5d"}
Apr 23 09:32:32.835935 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:32.835554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8" event={"ID":"0a099632-59e6-471e-9564-387d2c572c9f","Type":"ContainerStarted","Data":"164ea4ad00d021beb32d9c9f0101ae848fdd1529e638abda86466b896381dc5d"}
Apr 23 09:32:32.855360 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:32.855317 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-695f97b69f-4zff8" podStartSLOduration=1.9641709710000002 podStartE2EDuration="3.855303792s" podCreationTimestamp="2026-04-23 09:32:29 +0000 UTC" firstStartedPulling="2026-04-23 09:32:30.255524066 +0000 UTC m=+108.410619012" lastFinishedPulling="2026-04-23 09:32:32.146656884 +0000 UTC m=+110.301751833" observedRunningTime="2026-04-23 09:32:32.854037294 +0000 UTC m=+111.009132262" watchObservedRunningTime="2026-04-23 09:32:32.855303792 +0000 UTC m=+111.010398760"
Apr 23 09:32:33.627296 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.627258 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"]
Apr 23 09:32:33.666984 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.666956 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:32:33.669334 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.669313 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.697674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.697648 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:32:33.838833 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.838833 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.839267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.839267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.839267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.839267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.838991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.839267 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.839020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfp5\" (UniqueName: \"kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940207 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940207 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940436 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940526 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940588 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940640 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940695 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfp5\" (UniqueName: \"kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.940938 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.940911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.941192 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.941164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.941499 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.941479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.941945 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.941924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.943563 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.943521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.943563 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.943552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.949521 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.949462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvfp5\" (UniqueName: \"kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5\") pod \"console-cfd4c879c-8qnp4\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") " pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:33.977746 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:33.977708 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:34.118933 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:34.118822 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:32:34.121445 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:32:34.121416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4568cc00_f05d_486f_be36_a5cfda350c6c.slice/crio-65a76dabbdce73da48f8a67c57e942000a31533b35071cdaef0e2f822283dea0 WatchSource:0}: Error finding container 65a76dabbdce73da48f8a67c57e942000a31533b35071cdaef0e2f822283dea0: Status 404 returned error can't find the container with id 65a76dabbdce73da48f8a67c57e942000a31533b35071cdaef0e2f822283dea0
Apr 23 09:32:34.842873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:34.842832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfd4c879c-8qnp4" event={"ID":"4568cc00-f05d-486f-be36-a5cfda350c6c","Type":"ContainerStarted","Data":"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"}
Apr 23 09:32:34.842873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:34.842870 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfd4c879c-8qnp4" event={"ID":"4568cc00-f05d-486f-be36-a5cfda350c6c","Type":"ContainerStarted","Data":"65a76dabbdce73da48f8a67c57e942000a31533b35071cdaef0e2f822283dea0"}
Apr 23 09:32:34.860182 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:34.860137 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cfd4c879c-8qnp4" podStartSLOduration=1.860122992 podStartE2EDuration="1.860122992s" podCreationTimestamp="2026-04-23 09:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:32:34.858943682 +0000 UTC m=+113.014038651" watchObservedRunningTime="2026-04-23 09:32:34.860122992 +0000 UTC m=+113.015217962"
Apr 23 09:32:39.652975 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:39.652930 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75bb568db7-wtvw8"
Apr 23 09:32:43.978341 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:43.978300 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:43.978698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:43.978353 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:43.982940 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:43.982919 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:44.874805 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:44.874760 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:32:44.920574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:44.920537 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:32:49.182174 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:49.182141 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5"
Apr 23 09:32:49.182174 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:49.182181 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5"
Apr 23 09:32:49.863205 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:49.863143 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-847d9cd844-8h57x" podUID="914265a0-79d7-4eb7-9d72-6b1a419f229e" containerName="console" containerID="cri-o://64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f" gracePeriod=15
Apr 23 09:32:50.099637 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.099616 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847d9cd844-8h57x_914265a0-79d7-4eb7-9d72-6b1a419f229e/console/0.log"
Apr 23 09:32:50.099748 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.099675 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847d9cd844-8h57x"
Apr 23 09:32:50.182602 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182519 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.182602 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182561 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.182602 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182593 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.183197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182652 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74gs\" (UniqueName: \"kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.183197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182706 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.183197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.182745 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config\") pod \"914265a0-79d7-4eb7-9d72-6b1a419f229e\" (UID: \"914265a0-79d7-4eb7-9d72-6b1a419f229e\") "
Apr 23 09:32:50.183197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.183011 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca" (OuterVolumeSpecName: "service-ca") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:32:50.183197 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.183121 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:32:50.183620 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.183212 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config" (OuterVolumeSpecName: "console-config") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:32:50.185021 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.184997 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:32:50.185122 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.185018 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:32:50.185122 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.185078 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs" (OuterVolumeSpecName: "kube-api-access-c74gs") pod "914265a0-79d7-4eb7-9d72-6b1a419f229e" (UID: "914265a0-79d7-4eb7-9d72-6b1a419f229e"). InnerVolumeSpecName "kube-api-access-c74gs".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:50.283596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283549 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-service-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.283596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283591 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-oauth-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.283596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283602 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c74gs\" (UniqueName: \"kubernetes.io/projected/914265a0-79d7-4eb7-9d72-6b1a419f229e-kube-api-access-c74gs\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.283596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283611 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-oauth-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.283596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283620 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.283965 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.283629 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914265a0-79d7-4eb7-9d72-6b1a419f229e-console-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:50.890117 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:32:50.890087 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847d9cd844-8h57x_914265a0-79d7-4eb7-9d72-6b1a419f229e/console/0.log" Apr 23 09:32:50.890285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.890126 2577 generic.go:358] "Generic (PLEG): container finished" podID="914265a0-79d7-4eb7-9d72-6b1a419f229e" containerID="64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f" exitCode=2 Apr 23 09:32:50.890285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.890163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d9cd844-8h57x" event={"ID":"914265a0-79d7-4eb7-9d72-6b1a419f229e","Type":"ContainerDied","Data":"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f"} Apr 23 09:32:50.890285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.890193 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847d9cd844-8h57x" Apr 23 09:32:50.890285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.890200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d9cd844-8h57x" event={"ID":"914265a0-79d7-4eb7-9d72-6b1a419f229e","Type":"ContainerDied","Data":"e824bf56d6f5effad27660950ed4b2f71f18102dbe8e8f1784b308f86f6fce60"} Apr 23 09:32:50.890285 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.890215 2577 scope.go:117] "RemoveContainer" containerID="64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f" Apr 23 09:32:50.897885 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.897867 2577 scope.go:117] "RemoveContainer" containerID="64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f" Apr 23 09:32:50.898132 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:50.898114 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f\": container with ID starting with 64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f not found: ID does not exist" containerID="64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f" Apr 23 09:32:50.898187 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.898140 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f"} err="failed to get container status \"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f\": rpc error: code = NotFound desc = could not find container \"64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f\": container with ID starting with 64af1bc43d276c06dfa38d9987e365fec6c66a7a6bbc742412281c1cb98b6c3f not found: ID does not exist" Apr 23 09:32:50.907698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.907674 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-847d9cd844-8h57x"] Apr 23 09:32:50.911165 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:50.911142 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-847d9cd844-8h57x"] Apr 23 09:32:52.407574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:52.407544 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914265a0-79d7-4eb7-9d72-6b1a419f229e" path="/var/lib/kubelet/pods/914265a0-79d7-4eb7-9d72-6b1a419f229e/volumes" Apr 23 09:32:58.646583 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.646536 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-75bb568db7-wtvw8" podUID="dfca5a52-7560-4f74-88c0-1ea067add34f" containerName="console" containerID="cri-o://bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1" gracePeriod=15 Apr 23 09:32:58.882111 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.882086 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75bb568db7-wtvw8_dfca5a52-7560-4f74-88c0-1ea067add34f/console/0.log" Apr 23 09:32:58.882217 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.882147 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:58.912427 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75bb568db7-wtvw8_dfca5a52-7560-4f74-88c0-1ea067add34f/console/0.log" Apr 23 09:32:58.912427 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912402 2577 generic.go:358] "Generic (PLEG): container finished" podID="dfca5a52-7560-4f74-88c0-1ea067add34f" containerID="bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1" exitCode=2 Apr 23 09:32:58.912614 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bb568db7-wtvw8" event={"ID":"dfca5a52-7560-4f74-88c0-1ea067add34f","Type":"ContainerDied","Data":"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1"} Apr 23 09:32:58.912614 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bb568db7-wtvw8" event={"ID":"dfca5a52-7560-4f74-88c0-1ea067add34f","Type":"ContainerDied","Data":"76566ce10491cd74aaab5eb70e81d0aea49dc5e94986df535342479dea429e51"} Apr 23 09:32:58.912614 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912483 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75bb568db7-wtvw8" Apr 23 09:32:58.912614 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.912540 2577 scope.go:117] "RemoveContainer" containerID="bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1" Apr 23 09:32:58.920407 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.920390 2577 scope.go:117] "RemoveContainer" containerID="bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1" Apr 23 09:32:58.920660 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:32:58.920639 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1\": container with ID starting with bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1 not found: ID does not exist" containerID="bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1" Apr 23 09:32:58.920710 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.920668 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1"} err="failed to get container status \"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1\": rpc error: code = NotFound desc = could not find container \"bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1\": container with ID starting with bd68e89f9badaceebcc7da67d49e81aac2bde2a589ed7c98327e5d4c5b37d5e1 not found: ID does not exist" Apr 23 09:32:58.951539 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951516 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951644 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:32:58.951543 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951644 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951562 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951644 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951593 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8mmd\" (UniqueName: \"kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951644 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951611 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951644 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951638 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.951881 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.951659 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca\") pod \"dfca5a52-7560-4f74-88c0-1ea067add34f\" (UID: \"dfca5a52-7560-4f74-88c0-1ea067add34f\") " Apr 23 09:32:58.952193 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952165 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:58.952537 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952432 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:58.952692 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952661 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca" (OuterVolumeSpecName: "service-ca") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:58.952832 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952731 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config" (OuterVolumeSpecName: "console-config") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:58.955868 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952952 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-oauth-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:58.955868 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952980 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-console-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:58.955868 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.952998 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-service-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:58.955868 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.953014 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfca5a52-7560-4f74-88c0-1ea067add34f-trusted-ca-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:58.957596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.957254 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:58.957596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.957278 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd" (OuterVolumeSpecName: "kube-api-access-r8mmd") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "kube-api-access-r8mmd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:58.958741 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:58.958714 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dfca5a52-7560-4f74-88c0-1ea067add34f" (UID: "dfca5a52-7560-4f74-88c0-1ea067add34f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:59.053966 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:59.053937 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:59.053966 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:59.053963 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfca5a52-7560-4f74-88c0-1ea067add34f-console-oauth-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:59.054106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:59.053973 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8mmd\" (UniqueName: \"kubernetes.io/projected/dfca5a52-7560-4f74-88c0-1ea067add34f-kube-api-access-r8mmd\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:32:59.233312 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:59.233263 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"] Apr 23 09:32:59.239837 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:32:59.239815 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75bb568db7-wtvw8"] Apr 23 09:33:00.408191 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:00.408157 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfca5a52-7560-4f74-88c0-1ea067add34f" path="/var/lib/kubelet/pods/dfca5a52-7560-4f74-88c0-1ea067add34f/volumes" Apr 23 09:33:09.187618 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:09.187586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:33:09.191389 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:09.191365 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55c9c986d8-bsgq5" Apr 23 09:33:09.940337 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:09.940264 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7ccbc4855d-7lnmg" podUID="27384921-96c4-40b0-9acc-195969899794" containerName="console" containerID="cri-o://18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001" gracePeriod=15 Apr 23 09:33:10.190174 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.190154 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ccbc4855d-7lnmg_27384921-96c4-40b0-9acc-195969899794/console/0.log" Apr 23 09:33:10.190481 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.190215 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ccbc4855d-7lnmg" Apr 23 09:33:10.245660 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245660 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245674 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245916 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245701 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert\") pod 
\"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245916 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245722 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245916 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245750 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchqm\" (UniqueName: \"kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245916 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245853 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.245916 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.245887 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle\") pod \"27384921-96c4-40b0-9acc-195969899794\" (UID: \"27384921-96c4-40b0-9acc-195969899794\") " Apr 23 09:33:10.246154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.246028 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config" (OuterVolumeSpecName: "console-config") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:33:10.246154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.246132 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-console-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:33:10.246154 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.246134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:33:10.246461 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.246397 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:33:10.246461 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.246405 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca" (OuterVolumeSpecName: "service-ca") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:33:10.248040 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.248013 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm" (OuterVolumeSpecName: "kube-api-access-hchqm") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "kube-api-access-hchqm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:33:10.248040 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.248028 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:33:10.248168 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.248094 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "27384921-96c4-40b0-9acc-195969899794" (UID: "27384921-96c4-40b0-9acc-195969899794"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:33:10.347344 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347281 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-trusted-ca-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.347344 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347339 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-oauth-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.347344 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347348 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-oauth-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.347344 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347358 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27384921-96c4-40b0-9acc-195969899794-console-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.347587 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347367 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hchqm\" (UniqueName: \"kubernetes.io/projected/27384921-96c4-40b0-9acc-195969899794-kube-api-access-hchqm\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.347587 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.347377 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27384921-96c4-40b0-9acc-195969899794-service-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:33:10.950122 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ccbc4855d-7lnmg_27384921-96c4-40b0-9acc-195969899794/console/0.log"
Apr 23 09:33:10.950375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950132 2577 generic.go:358] "Generic (PLEG): container finished" podID="27384921-96c4-40b0-9acc-195969899794" containerID="18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001" exitCode=2
Apr 23 09:33:10.950375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ccbc4855d-7lnmg" event={"ID":"27384921-96c4-40b0-9acc-195969899794","Type":"ContainerDied","Data":"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"}
Apr 23 09:33:10.950375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950196 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ccbc4855d-7lnmg"
Apr 23 09:33:10.950375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ccbc4855d-7lnmg" event={"ID":"27384921-96c4-40b0-9acc-195969899794","Type":"ContainerDied","Data":"bb8e40fcef1e10dc3377f9440f1fccd521205b74c202323852efa8c826fab94f"}
Apr 23 09:33:10.950375 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.950224 2577 scope.go:117] "RemoveContainer" containerID="18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"
Apr 23 09:33:10.958162 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.958150 2577 scope.go:117] "RemoveContainer" containerID="18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"
Apr 23 09:33:10.958402 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:33:10.958379 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001\": container with ID starting with 18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001 not found: ID does not exist" containerID="18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"
Apr 23 09:33:10.958483 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.958410 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001"} err="failed to get container status \"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001\": rpc error: code = NotFound desc = could not find container \"18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001\": container with ID starting with 18bef4353c13b1f7fc4e7b3c98c766767d9b8c8d631e93b3f7c0619f6a975001 not found: ID does not exist"
Apr 23 09:33:10.967603 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.967570 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:33:10.971761 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:10.971743 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7ccbc4855d-7lnmg"]
Apr 23 09:33:12.407785 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:12.407743 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27384921-96c4-40b0-9acc-195969899794" path="/var/lib/kubelet/pods/27384921-96c4-40b0-9acc-195969899794/volumes"
Apr 23 09:33:44.306963 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.306926 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307221 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="914265a0-79d7-4eb7-9d72-6b1a419f229e" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307235 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="914265a0-79d7-4eb7-9d72-6b1a419f229e" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307250 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27384921-96c4-40b0-9acc-195969899794" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307256 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="27384921-96c4-40b0-9acc-195969899794" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307265 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfca5a52-7560-4f74-88c0-1ea067add34f" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307270 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfca5a52-7560-4f74-88c0-1ea067add34f" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307313 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="914265a0-79d7-4eb7-9d72-6b1a419f229e" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307321 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfca5a52-7560-4f74-88c0-1ea067add34f" containerName="console"
Apr 23 09:33:44.307371 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.307329 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="27384921-96c4-40b0-9acc-195969899794" containerName="console"
Apr 23 09:33:44.309971 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.309955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.318386 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.318361 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:33:44.410538 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh22b\" (UniqueName: \"kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.410698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.410689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.511362 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.511490 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.511490 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.511589 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh22b\" (UniqueName: \"kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.511701 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512001 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.511977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512121 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.512017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512179 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.512115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512237 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.512179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512495 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.512469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.512615 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.512599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.514220 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.514205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.514596 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.514572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.518630 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.518607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh22b\" (UniqueName: \"kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b\") pod \"console-b4696bc44-n22hx\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") " pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.619743 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.619659 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:44.741654 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:44.741503 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:33:44.743837 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:33:44.743803 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5fa65b7_aa69_4190_842b_296102192184.slice/crio-fbc0d883768ae4fe31219cc6e17b9ef37140698680d32c7b50ef0c050670d137 WatchSource:0}: Error finding container fbc0d883768ae4fe31219cc6e17b9ef37140698680d32c7b50ef0c050670d137: Status 404 returned error can't find the container with id fbc0d883768ae4fe31219cc6e17b9ef37140698680d32c7b50ef0c050670d137
Apr 23 09:33:45.051204 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:45.051167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4696bc44-n22hx" event={"ID":"c5fa65b7-aa69-4190-842b-296102192184","Type":"ContainerStarted","Data":"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"}
Apr 23 09:33:45.051204 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:45.051206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4696bc44-n22hx" event={"ID":"c5fa65b7-aa69-4190-842b-296102192184","Type":"ContainerStarted","Data":"fbc0d883768ae4fe31219cc6e17b9ef37140698680d32c7b50ef0c050670d137"}
Apr 23 09:33:45.071674 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:45.071620 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b4696bc44-n22hx" podStartSLOduration=1.071601427 podStartE2EDuration="1.071601427s" podCreationTimestamp="2026-04-23 09:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:33:45.069470247 +0000 UTC m=+183.224565215" watchObservedRunningTime="2026-04-23 09:33:45.071601427 +0000 UTC m=+183.226696397"
Apr 23 09:33:54.619829 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:54.619768 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:54.619829 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:54.619841 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:54.624505 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:54.624475 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:55.080650 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:55.080620 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:33:55.122675 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:33:55.122637 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:34:20.142147 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.142105 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cfd4c879c-8qnp4" podUID="4568cc00-f05d-486f-be36-a5cfda350c6c" containerName="console" containerID="cri-o://0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a" gracePeriod=15
Apr 23 09:34:20.373367 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.373346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cfd4c879c-8qnp4_4568cc00-f05d-486f-be36-a5cfda350c6c/console/0.log"
Apr 23 09:34:20.373475 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.373406 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:34:20.497835 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.497805 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.497994 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.497853 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvfp5\" (UniqueName: \"kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.497994 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.497892 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.497994 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.497938 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.497994 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.497963 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.498199 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498060 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.498199 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498121 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config\") pod \"4568cc00-f05d-486f-be36-a5cfda350c6c\" (UID: \"4568cc00-f05d-486f-be36-a5cfda350c6c\") "
Apr 23 09:34:20.498311 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498252 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:34:20.498366 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498308 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:34:20.498366 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498355 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:34:20.498447 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498402 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-oauth-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.498447 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498396 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config" (OuterVolumeSpecName: "console-config") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:34:20.498447 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498413 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-service-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.498447 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.498423 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-trusted-ca-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.500238 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.500209 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:34:20.500336 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.500241 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:34:20.500336 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.500269 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5" (OuterVolumeSpecName: "kube-api-access-zvfp5") pod "4568cc00-f05d-486f-be36-a5cfda350c6c" (UID: "4568cc00-f05d-486f-be36-a5cfda350c6c"). InnerVolumeSpecName "kube-api-access-zvfp5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:34:20.599328 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.599299 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvfp5\" (UniqueName: \"kubernetes.io/projected/4568cc00-f05d-486f-be36-a5cfda350c6c-kube-api-access-zvfp5\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.599328 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.599322 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4568cc00-f05d-486f-be36-a5cfda350c6c-console-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.599328 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.599332 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:20.599532 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:20.599341 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4568cc00-f05d-486f-be36-a5cfda350c6c-console-oauth-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:34:21.147574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147543 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cfd4c879c-8qnp4_4568cc00-f05d-486f-be36-a5cfda350c6c/console/0.log"
Apr 23 09:34:21.148050 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147589 2577 generic.go:358] "Generic (PLEG): container finished" podID="4568cc00-f05d-486f-be36-a5cfda350c6c" containerID="0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a" exitCode=2
Apr 23 09:34:21.148050 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147662 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfd4c879c-8qnp4"
Apr 23 09:34:21.148050 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfd4c879c-8qnp4" event={"ID":"4568cc00-f05d-486f-be36-a5cfda350c6c","Type":"ContainerDied","Data":"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"}
Apr 23 09:34:21.148050 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfd4c879c-8qnp4" event={"ID":"4568cc00-f05d-486f-be36-a5cfda350c6c","Type":"ContainerDied","Data":"65a76dabbdce73da48f8a67c57e942000a31533b35071cdaef0e2f822283dea0"}
Apr 23 09:34:21.148050 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.147728 2577 scope.go:117] "RemoveContainer" containerID="0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"
Apr 23 09:34:21.157100 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.157079 2577 scope.go:117] "RemoveContainer" containerID="0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"
Apr 23 09:34:21.157406 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:34:21.157384 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a\": container with ID starting with 0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a not found: ID does not exist" containerID="0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"
Apr 23 09:34:21.157495 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.157427 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a"} err="failed to get container status \"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a\": rpc error: code = NotFound desc = could not find container \"0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a\": container with ID starting with 0e48e22323a478c95fc7e582f916ed1a4591fe5dabb5a780d9e54c8b1382088a not found: ID does not exist"
Apr 23 09:34:21.170350 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.170325 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:34:21.174439 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:21.174421 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cfd4c879c-8qnp4"]
Apr 23 09:34:22.408144 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:22.408105 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4568cc00-f05d-486f-be36-a5cfda350c6c" path="/var/lib/kubelet/pods/4568cc00-f05d-486f-be36-a5cfda350c6c/volumes"
Apr 23 09:34:31.702464 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.702432 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"]
Apr 23 09:34:31.702981 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.702833 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4568cc00-f05d-486f-be36-a5cfda350c6c" containerName="console"
Apr 23 09:34:31.702981 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.702850 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4568cc00-f05d-486f-be36-a5cfda350c6c" containerName="console"
Apr 23 09:34:31.702981 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.702944 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4568cc00-f05d-486f-be36-a5cfda350c6c" containerName="console"
Apr 23 09:34:31.704992 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.704969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.707445 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.707425 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 09:34:31.707525 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.707425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fwp7x\""
Apr 23 09:34:31.708203 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.708189 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 09:34:31.715824 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.715804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"]
Apr 23 09:34:31.786193 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.786165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.786193 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.786202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28q8v\" (UniqueName: \"kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.786382 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.786279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.887506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.887458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.887506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.887515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28q8v\" (UniqueName: \"kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.887702 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.887560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.887958 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.887940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.888003 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.887979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:31.896483 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:31.896461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28q8v\" (UniqueName: \"kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:32.014234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:32.014206 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"
Apr 23 09:34:32.132549 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:32.132523 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd"]
Apr 23 09:34:32.135260 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:34:32.135229 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169761b1_2591_4ed8_858a_8077ce5bba57.slice/crio-a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe WatchSource:0}: Error finding container a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe: Status 404 returned error can't find the container with id a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe
Apr 23 09:34:32.185229 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:32.185196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerStarted","Data":"a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe"}
Apr 23 09:34:37.201262 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:37.201212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerStarted","Data":"1ea2f1927b706290d6aa86a893ef982322b3870443a099c553200288053506ce"}
Apr 23 09:34:38.206273 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:38.206238 2577 generic.go:358] "Generic (PLEG): container finished" podID="169761b1-2591-4ed8-858a-8077ce5bba57" containerID="1ea2f1927b706290d6aa86a893ef982322b3870443a099c553200288053506ce" exitCode=0
Apr 23 09:34:38.206726 ip-10-0-140-162
kubenswrapper[2577]: I0423 09:34:38.206325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerDied","Data":"1ea2f1927b706290d6aa86a893ef982322b3870443a099c553200288053506ce"} Apr 23 09:34:41.216876 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:41.216834 2577 generic.go:358] "Generic (PLEG): container finished" podID="169761b1-2591-4ed8-858a-8077ce5bba57" containerID="5043473a261e38de110b67495862877dfb9ddb64ae05b1c6edd677ec9c583af4" exitCode=0 Apr 23 09:34:41.217252 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:41.216922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerDied","Data":"5043473a261e38de110b67495862877dfb9ddb64ae05b1c6edd677ec9c583af4"} Apr 23 09:34:48.239295 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:48.239260 2577 generic.go:358] "Generic (PLEG): container finished" podID="169761b1-2591-4ed8-858a-8077ce5bba57" containerID="10dbbe37567fd3a5672373e22ccb075ac9008f6a796dd13123ac937e26036a98" exitCode=0 Apr 23 09:34:48.239655 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:48.239351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerDied","Data":"10dbbe37567fd3a5672373e22ccb075ac9008f6a796dd13123ac937e26036a98"} Apr 23 09:34:49.375507 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.375483 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" Apr 23 09:34:49.546556 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.546466 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28q8v\" (UniqueName: \"kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v\") pod \"169761b1-2591-4ed8-858a-8077ce5bba57\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " Apr 23 09:34:49.546556 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.546541 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util\") pod \"169761b1-2591-4ed8-858a-8077ce5bba57\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " Apr 23 09:34:49.546739 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.546587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle\") pod \"169761b1-2591-4ed8-858a-8077ce5bba57\" (UID: \"169761b1-2591-4ed8-858a-8077ce5bba57\") " Apr 23 09:34:49.547257 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.547219 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle" (OuterVolumeSpecName: "bundle") pod "169761b1-2591-4ed8-858a-8077ce5bba57" (UID: "169761b1-2591-4ed8-858a-8077ce5bba57"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:34:49.548967 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.548938 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v" (OuterVolumeSpecName: "kube-api-access-28q8v") pod "169761b1-2591-4ed8-858a-8077ce5bba57" (UID: "169761b1-2591-4ed8-858a-8077ce5bba57"). InnerVolumeSpecName "kube-api-access-28q8v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:34:49.552037 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.551998 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util" (OuterVolumeSpecName: "util") pod "169761b1-2591-4ed8-858a-8077ce5bba57" (UID: "169761b1-2591-4ed8-858a-8077ce5bba57"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:34:49.648070 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.648025 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28q8v\" (UniqueName: \"kubernetes.io/projected/169761b1-2591-4ed8-858a-8077ce5bba57-kube-api-access-28q8v\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:34:49.648070 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.648055 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-util\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:34:49.648070 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:49.648065 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/169761b1-2591-4ed8-858a-8077ce5bba57-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:34:50.252506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:50.252473 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" event={"ID":"169761b1-2591-4ed8-858a-8077ce5bba57","Type":"ContainerDied","Data":"a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe"} Apr 23 09:34:50.252506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:50.252503 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a00dfa9157a510a103c0731bf685114dd3ad5671bdeae557d7658a680af95bfe" Apr 23 09:34:50.252506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:50.252508 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dr4pwd" Apr 23 09:34:54.702926 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.702845 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht"] Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703133 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="pull" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703144 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="pull" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703152 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="extract" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703158 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="extract" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703168 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="util" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703174 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="util" Apr 23 09:34:54.703396 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.703209 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="169761b1-2591-4ed8-858a-8077ce5bba57" containerName="extract" Apr 23 09:34:54.748270 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.748245 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht"] Apr 23 09:34:54.748420 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.748349 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.751069 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.751047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:34:54.751224 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.751118 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 23 09:34:54.751224 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.751162 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6nvdr\"" Apr 23 09:34:54.778604 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.778579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f174a1a1-7caf-4d50-8602-97d032201bf9-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: 
\"f174a1a1-7caf-4d50-8602-97d032201bf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.778719 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.778617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4k2h\" (UniqueName: \"kubernetes.io/projected/f174a1a1-7caf-4d50-8602-97d032201bf9-kube-api-access-z4k2h\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: \"f174a1a1-7caf-4d50-8602-97d032201bf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.879100 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.879062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4k2h\" (UniqueName: \"kubernetes.io/projected/f174a1a1-7caf-4d50-8602-97d032201bf9-kube-api-access-z4k2h\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: \"f174a1a1-7caf-4d50-8602-97d032201bf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.879205 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.879147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f174a1a1-7caf-4d50-8602-97d032201bf9-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: \"f174a1a1-7caf-4d50-8602-97d032201bf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.879467 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.879451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f174a1a1-7caf-4d50-8602-97d032201bf9-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: \"f174a1a1-7caf-4d50-8602-97d032201bf9\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:54.900464 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:54.900434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4k2h\" (UniqueName: \"kubernetes.io/projected/f174a1a1-7caf-4d50-8602-97d032201bf9-kube-api-access-z4k2h\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-phpht\" (UID: \"f174a1a1-7caf-4d50-8602-97d032201bf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:55.057905 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:55.057876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" Apr 23 09:34:55.182758 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:55.182734 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht"] Apr 23 09:34:55.185474 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:34:55.185441 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf174a1a1_7caf_4d50_8602_97d032201bf9.slice/crio-40b6e4a16109fe7d7d6e00f391ee6c69fe68678cc11193b62b544a021a211090 WatchSource:0}: Error finding container 40b6e4a16109fe7d7d6e00f391ee6c69fe68678cc11193b62b544a021a211090: Status 404 returned error can't find the container with id 40b6e4a16109fe7d7d6e00f391ee6c69fe68678cc11193b62b544a021a211090 Apr 23 09:34:55.268184 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:55.268148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" event={"ID":"f174a1a1-7caf-4d50-8602-97d032201bf9","Type":"ContainerStarted","Data":"40b6e4a16109fe7d7d6e00f391ee6c69fe68678cc11193b62b544a021a211090"} Apr 23 09:34:58.279073 
ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:58.279035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" event={"ID":"f174a1a1-7caf-4d50-8602-97d032201bf9","Type":"ContainerStarted","Data":"e74a01364ae3ba0f7ec9454d844776eca51fe53cff21c89d79bc97bc32277cf0"} Apr 23 09:34:58.303593 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:34:58.303541 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-phpht" podStartSLOduration=1.872666713 podStartE2EDuration="4.303526698s" podCreationTimestamp="2026-04-23 09:34:54 +0000 UTC" firstStartedPulling="2026-04-23 09:34:55.187880779 +0000 UTC m=+253.342975726" lastFinishedPulling="2026-04-23 09:34:57.618740764 +0000 UTC m=+255.773835711" observedRunningTime="2026-04-23 09:34:58.302511822 +0000 UTC m=+256.457606791" watchObservedRunningTime="2026-04-23 09:34:58.303526698 +0000 UTC m=+256.458621667" Apr 23 09:35:00.873240 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.873202 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-44lm9"] Apr 23 09:35:00.877057 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.877037 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:00.879546 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.879527 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 09:35:00.880374 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.880348 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 09:35:00.880501 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.880374 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-7tp67\"" Apr 23 09:35:00.884368 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.884350 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-44lm9"] Apr 23 09:35:00.925832 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.925796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:00.925967 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:00.925843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nxs\" (UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-kube-api-access-x4nxs\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.027148 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.027113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.027317 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.027153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nxs\" (UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-kube-api-access-x4nxs\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.036250 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.036220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.036353 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.036291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nxs\" (UniqueName: \"kubernetes.io/projected/58016c74-52f5-4df9-ade7-78043b2a8f2a-kube-api-access-x4nxs\") pod \"cert-manager-webhook-587ccfb98-44lm9\" (UID: \"58016c74-52f5-4df9-ade7-78043b2a8f2a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.196922 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.196825 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:01.314686 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:01.314657 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-44lm9"] Apr 23 09:35:01.317942 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:35:01.317898 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58016c74_52f5_4df9_ade7_78043b2a8f2a.slice/crio-f82f47378e165964c763a0b17b81867081eabecb5283611688af371e34942ec8 WatchSource:0}: Error finding container f82f47378e165964c763a0b17b81867081eabecb5283611688af371e34942ec8: Status 404 returned error can't find the container with id f82f47378e165964c763a0b17b81867081eabecb5283611688af371e34942ec8 Apr 23 09:35:02.293823 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:02.293789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" event={"ID":"58016c74-52f5-4df9-ade7-78043b2a8f2a","Type":"ContainerStarted","Data":"f82f47378e165964c763a0b17b81867081eabecb5283611688af371e34942ec8"} Apr 23 09:35:03.509258 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.509223 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nk8p7"] Apr 23 09:35:03.512595 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.512576 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.515000 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.514978 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-bd877\"" Apr 23 09:35:03.520542 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.520521 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nk8p7"] Apr 23 09:35:03.546082 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.546054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gf67\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-kube-api-access-8gf67\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.546219 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.546097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.647121 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.647075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gf67\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-kube-api-access-8gf67\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.647304 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.647144 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.655846 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.655807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.656117 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.656096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gf67\" (UniqueName: \"kubernetes.io/projected/f69cdb8e-484d-42d0-898a-fe08f8da119f-kube-api-access-8gf67\") pod \"cert-manager-cainjector-68b757865b-nk8p7\" (UID: \"f69cdb8e-484d-42d0-898a-fe08f8da119f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:03.823915 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:03.823838 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" Apr 23 09:35:04.668668 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:04.668642 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nk8p7"] Apr 23 09:35:05.305520 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.305480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" event={"ID":"58016c74-52f5-4df9-ade7-78043b2a8f2a","Type":"ContainerStarted","Data":"e8b7c0dbac194f3c44e24f58735f9c60256299dbbd6f825810abc04dca90e8ff"} Apr 23 09:35:05.305712 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.305551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:05.306766 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.306743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" event={"ID":"f69cdb8e-484d-42d0-898a-fe08f8da119f","Type":"ContainerStarted","Data":"fa829e94119957e2960c470519a8440fdd1d97141c8f09bb0bd62a8318b31bc2"} Apr 23 09:35:05.306876 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.306770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" event={"ID":"f69cdb8e-484d-42d0-898a-fe08f8da119f","Type":"ContainerStarted","Data":"d18091203e52d14086ed52b3bc27926837432bcf459cb9b212bbd73314ae78bb"} Apr 23 09:35:05.320978 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.320936 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" podStartSLOduration=2.02747259 podStartE2EDuration="5.320924472s" podCreationTimestamp="2026-04-23 09:35:00 +0000 UTC" firstStartedPulling="2026-04-23 09:35:01.319896682 +0000 UTC m=+259.474991629" lastFinishedPulling="2026-04-23 
09:35:04.613348563 +0000 UTC m=+262.768443511" observedRunningTime="2026-04-23 09:35:05.320275737 +0000 UTC m=+263.475370706" watchObservedRunningTime="2026-04-23 09:35:05.320924472 +0000 UTC m=+263.476019441" Apr 23 09:35:05.333722 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:05.333684 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-nk8p7" podStartSLOduration=2.333674774 podStartE2EDuration="2.333674774s" podCreationTimestamp="2026-04-23 09:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:35:05.333395456 +0000 UTC m=+263.488490425" watchObservedRunningTime="2026-04-23 09:35:05.333674774 +0000 UTC m=+263.488769743" Apr 23 09:35:11.312699 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:11.312668 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-44lm9" Apr 23 09:35:19.860264 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.860234 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-l9nc5"] Apr 23 09:35:19.863449 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.863432 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:19.865852 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.865828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-rzlwg\"" Apr 23 09:35:19.869857 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.869834 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-l9nc5"] Apr 23 09:35:19.974405 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.974370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-bound-sa-token\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:19.974579 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:19.974452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr46t\" (UniqueName: \"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-kube-api-access-jr46t\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.075556 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.075526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr46t\" (UniqueName: \"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-kube-api-access-jr46t\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.075698 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.075571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-bound-sa-token\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.083791 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.083752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr46t\" (UniqueName: \"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-kube-api-access-jr46t\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.084026 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.084003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5433a844-6de1-46c7-9d5d-dca6144f2027-bound-sa-token\") pod \"cert-manager-79c8d999ff-l9nc5\" (UID: \"5433a844-6de1-46c7-9d5d-dca6144f2027\") " pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.172875 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.172789 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-l9nc5" Apr 23 09:35:20.291078 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.291045 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-l9nc5"] Apr 23 09:35:20.294447 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:35:20.294419 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5433a844_6de1_46c7_9d5d_dca6144f2027.slice/crio-cd49ea64f68de63ebdd3a37021da803e2d8e0de15b80998cf46cea65832964b5 WatchSource:0}: Error finding container cd49ea64f68de63ebdd3a37021da803e2d8e0de15b80998cf46cea65832964b5: Status 404 returned error can't find the container with id cd49ea64f68de63ebdd3a37021da803e2d8e0de15b80998cf46cea65832964b5 Apr 23 09:35:20.358932 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.358903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-l9nc5" event={"ID":"5433a844-6de1-46c7-9d5d-dca6144f2027","Type":"ContainerStarted","Data":"a9b99b6164d5ce356289054ba7b695ee5c49e03915ee0472034b65b0eca06fd5"} Apr 23 09:35:20.359138 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.358942 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-l9nc5" event={"ID":"5433a844-6de1-46c7-9d5d-dca6144f2027","Type":"ContainerStarted","Data":"cd49ea64f68de63ebdd3a37021da803e2d8e0de15b80998cf46cea65832964b5"} Apr 23 09:35:20.374944 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.374897 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-l9nc5" podStartSLOduration=1.374876844 podStartE2EDuration="1.374876844s" podCreationTimestamp="2026-04-23 09:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:35:20.373399562 +0000 UTC 
m=+278.528494530" watchObservedRunningTime="2026-04-23 09:35:20.374876844 +0000 UTC m=+278.529971813" Apr 23 09:35:20.722874 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.722844 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm"] Apr 23 09:35:20.726106 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.726088 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.729052 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.729030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 09:35:20.729052 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.729059 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fwp7x\"" Apr 23 09:35:20.729235 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.729098 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 09:35:20.744311 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.744285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm"] Apr 23 09:35:20.780897 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.780853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.780897 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:35:20.780905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65gj\" (UniqueName: \"kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.781059 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.780999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.881948 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.881918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.882260 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.881969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.882260 ip-10-0-140-162 kubenswrapper[2577]: I0423 
09:35:20.882092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f65gj\" (UniqueName: \"kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.882260 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.882254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.882355 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.882331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:20.893408 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:20.893385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65gj\" (UniqueName: \"kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:21.035635 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:21.035607 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:21.156641 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:21.156614 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm"] Apr 23 09:35:21.158956 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:35:21.158929 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0414c6f9_8405_48ea_8795_a9b97c0af90c.slice/crio-a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1 WatchSource:0}: Error finding container a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1: Status 404 returned error can't find the container with id a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1 Apr 23 09:35:21.363957 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:21.363876 2577 generic.go:358] "Generic (PLEG): container finished" podID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerID="89bd2e37416ca29e841bfa4fcb4d9bae8fd1cce79454d17eb4620cf44323d1a0" exitCode=0 Apr 23 09:35:21.363957 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:21.363934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" event={"ID":"0414c6f9-8405-48ea-8795-a9b97c0af90c","Type":"ContainerDied","Data":"89bd2e37416ca29e841bfa4fcb4d9bae8fd1cce79454d17eb4620cf44323d1a0"} Apr 23 09:35:21.364136 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:21.363970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" event={"ID":"0414c6f9-8405-48ea-8795-a9b97c0af90c","Type":"ContainerStarted","Data":"a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1"} Apr 23 09:35:24.375862 ip-10-0-140-162 kubenswrapper[2577]: 
I0423 09:35:24.375826 2577 generic.go:358] "Generic (PLEG): container finished" podID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerID="3789031c16f96745407eebf7f41df131532e353ecb89ee6ba3b8ba24ad6f862c" exitCode=0 Apr 23 09:35:24.376293 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:24.375888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" event={"ID":"0414c6f9-8405-48ea-8795-a9b97c0af90c","Type":"ContainerDied","Data":"3789031c16f96745407eebf7f41df131532e353ecb89ee6ba3b8ba24ad6f862c"} Apr 23 09:35:25.381538 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:25.381500 2577 generic.go:358] "Generic (PLEG): container finished" podID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerID="1b3f2b1978eb890e576736ec6b731d86794e10d14d75af4970672a4c3eee1cc7" exitCode=0 Apr 23 09:35:25.381930 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:25.381542 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" event={"ID":"0414c6f9-8405-48ea-8795-a9b97c0af90c","Type":"ContainerDied","Data":"1b3f2b1978eb890e576736ec6b731d86794e10d14d75af4970672a4c3eee1cc7"} Apr 23 09:35:26.502530 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.502503 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:26.630378 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.630338 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util\") pod \"0414c6f9-8405-48ea-8795-a9b97c0af90c\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " Apr 23 09:35:26.630378 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.630388 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f65gj\" (UniqueName: \"kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj\") pod \"0414c6f9-8405-48ea-8795-a9b97c0af90c\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " Apr 23 09:35:26.630631 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.630432 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle\") pod \"0414c6f9-8405-48ea-8795-a9b97c0af90c\" (UID: \"0414c6f9-8405-48ea-8795-a9b97c0af90c\") " Apr 23 09:35:26.630870 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.630848 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle" (OuterVolumeSpecName: "bundle") pod "0414c6f9-8405-48ea-8795-a9b97c0af90c" (UID: "0414c6f9-8405-48ea-8795-a9b97c0af90c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:35:26.632792 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.632753 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj" (OuterVolumeSpecName: "kube-api-access-f65gj") pod "0414c6f9-8405-48ea-8795-a9b97c0af90c" (UID: "0414c6f9-8405-48ea-8795-a9b97c0af90c"). InnerVolumeSpecName "kube-api-access-f65gj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:35:26.730975 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.730936 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f65gj\" (UniqueName: \"kubernetes.io/projected/0414c6f9-8405-48ea-8795-a9b97c0af90c-kube-api-access-f65gj\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:35:26.730975 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.730965 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:35:26.757658 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.757625 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util" (OuterVolumeSpecName: "util") pod "0414c6f9-8405-48ea-8795-a9b97c0af90c" (UID: "0414c6f9-8405-48ea-8795-a9b97c0af90c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:35:26.832324 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:26.832291 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0414c6f9-8405-48ea-8795-a9b97c0af90c-util\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\"" Apr 23 09:35:27.390822 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:27.390765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" event={"ID":"0414c6f9-8405-48ea-8795-a9b97c0af90c","Type":"ContainerDied","Data":"a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1"} Apr 23 09:35:27.390822 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:27.390821 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78elgrxm" Apr 23 09:35:27.391030 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:27.390823 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98a26cabb883d77259d37d430877964dde2d9049091052ea50a072d12f7e8f1" Apr 23 09:35:42.319347 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:42.319315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log" Apr 23 09:35:42.319829 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:42.319814 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log" Apr 23 09:35:42.325901 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:42.325883 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 09:35:43.496631 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv"] Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496909 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="extract" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496920 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="extract" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496933 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="util" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496939 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="util" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496952 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="pull" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.496957 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="pull" Apr 23 09:35:43.497860 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.497007 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0414c6f9-8405-48ea-8795-a9b97c0af90c" containerName="extract" Apr 23 09:35:43.498570 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.498555 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.502108 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.502086 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 23 09:35:43.502234 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.502202 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 23 09:35:43.502952 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.502935 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:35:43.503021 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.503000 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 23 09:35:43.503069 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.503018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 23 09:35:43.503119 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.503069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-44h7j\"" Apr 23 09:35:43.519456 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.519428 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv"] Apr 23 09:35:43.677985 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.677944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa872faf-6ad8-4ad4-9501-890af666b892-manager-config\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: 
\"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.678170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.677989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkp29\" (UniqueName: \"kubernetes.io/projected/fa872faf-6ad8-4ad4-9501-890af666b892-kube-api-access-wkp29\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.678170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.678050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-metrics-certs\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.678170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.678116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-cert\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.778796 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.778673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-cert\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.778796 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:35:43.778764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa872faf-6ad8-4ad4-9501-890af666b892-manager-config\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.778796 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.778805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkp29\" (UniqueName: \"kubernetes.io/projected/fa872faf-6ad8-4ad4-9501-890af666b892-kube-api-access-wkp29\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.779055 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.778827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-metrics-certs\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.779478 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.779455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa872faf-6ad8-4ad4-9501-890af666b892-manager-config\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.781339 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.781315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-metrics-certs\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.781449 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.781356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa872faf-6ad8-4ad4-9501-890af666b892-cert\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.786370 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.786348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkp29\" (UniqueName: \"kubernetes.io/projected/fa872faf-6ad8-4ad4-9501-890af666b892-kube-api-access-wkp29\") pod \"jobset-controller-manager-7bd6579f68-55rvv\" (UID: \"fa872faf-6ad8-4ad4-9501-890af666b892\") " pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.808429 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.808404 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:43.931827 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.931689 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv"] Apr 23 09:35:43.933982 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:35:43.933951 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa872faf_6ad8_4ad4_9501_890af666b892.slice/crio-0f467fe4d2c348c6905443fb87f6d8608b122a27b03f4b0590cc1d1f0ff4cf35 WatchSource:0}: Error finding container 0f467fe4d2c348c6905443fb87f6d8608b122a27b03f4b0590cc1d1f0ff4cf35: Status 404 returned error can't find the container with id 0f467fe4d2c348c6905443fb87f6d8608b122a27b03f4b0590cc1d1f0ff4cf35 Apr 23 09:35:43.935917 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:43.935898 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:35:44.441458 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:44.441424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" event={"ID":"fa872faf-6ad8-4ad4-9501-890af666b892","Type":"ContainerStarted","Data":"0f467fe4d2c348c6905443fb87f6d8608b122a27b03f4b0590cc1d1f0ff4cf35"} Apr 23 09:35:47.453852 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:47.453811 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" event={"ID":"fa872faf-6ad8-4ad4-9501-890af666b892","Type":"ContainerStarted","Data":"7e537403c9b159d76ca2de07afe16925a68190b0a30362f7e97fd322c6f52a3d"} Apr 23 09:35:47.454296 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:47.453935 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:35:47.468941 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:47.468890 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" podStartSLOduration=1.70456505 podStartE2EDuration="4.468876612s" podCreationTimestamp="2026-04-23 09:35:43 +0000 UTC" firstStartedPulling="2026-04-23 09:35:43.936055323 +0000 UTC m=+302.091150270" lastFinishedPulling="2026-04-23 09:35:46.700366885 +0000 UTC m=+304.855461832" observedRunningTime="2026-04-23 09:35:47.468296654 +0000 UTC m=+305.623391625" watchObservedRunningTime="2026-04-23 09:35:47.468876612 +0000 UTC m=+305.623971581" Apr 23 09:35:58.462150 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:35:58.462120 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-7bd6579f68-55rvv" Apr 23 09:37:44.920526 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:44.920492 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c67fd88bb-p2hzp"] Apr 23 09:37:44.923648 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:44.923631 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:44.934386 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:44.934361 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c67fd88bb-p2hzp"]
Apr 23 09:37:45.054386 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-oauth-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054386 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-service-ca\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn6p\" (UniqueName: \"kubernetes.io/projected/ed0821f1-5383-4767-9bab-504e4543445f-kube-api-access-7qn6p\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-oauth-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-trusted-ca-bundle\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.054616 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.054568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-console-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-oauth-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155506 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-service-ca\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155752 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155752 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn6p\" (UniqueName: \"kubernetes.io/projected/ed0821f1-5383-4767-9bab-504e4543445f-kube-api-access-7qn6p\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155752 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-oauth-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155752 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-trusted-ca-bundle\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.155752 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.155664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-console-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.156341 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.156316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-service-ca\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.156446 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.156316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-oauth-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.156490 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.156471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-trusted-ca-bundle\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.156564 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.156544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed0821f1-5383-4767-9bab-504e4543445f-console-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.158201 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.158174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-serving-cert\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.158277 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.158222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed0821f1-5383-4767-9bab-504e4543445f-console-oauth-config\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.164833 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.164813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn6p\" (UniqueName: \"kubernetes.io/projected/ed0821f1-5383-4767-9bab-504e4543445f-kube-api-access-7qn6p\") pod \"console-c67fd88bb-p2hzp\" (UID: \"ed0821f1-5383-4767-9bab-504e4543445f\") " pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.232930 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.232858 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:45.352803 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.352746 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c67fd88bb-p2hzp"]
Apr 23 09:37:45.355576 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:37:45.355549 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded0821f1_5383_4767_9bab_504e4543445f.slice/crio-9abb0175f2439912cf8e967a4d2fbfff390f804fc4c5cab8408162d79af40756 WatchSource:0}: Error finding container 9abb0175f2439912cf8e967a4d2fbfff390f804fc4c5cab8408162d79af40756: Status 404 returned error can't find the container with id 9abb0175f2439912cf8e967a4d2fbfff390f804fc4c5cab8408162d79af40756
Apr 23 09:37:45.824044 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.823999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c67fd88bb-p2hzp" event={"ID":"ed0821f1-5383-4767-9bab-504e4543445f","Type":"ContainerStarted","Data":"6b6f9e1649eaffe120d1596ba53c60a61e26bbff2a10456b6cfc42a27b37e460"}
Apr 23 09:37:45.824044 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.824043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c67fd88bb-p2hzp" event={"ID":"ed0821f1-5383-4767-9bab-504e4543445f","Type":"ContainerStarted","Data":"9abb0175f2439912cf8e967a4d2fbfff390f804fc4c5cab8408162d79af40756"}
Apr 23 09:37:45.841220 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:45.841166 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c67fd88bb-p2hzp" podStartSLOduration=1.841151822 podStartE2EDuration="1.841151822s" podCreationTimestamp="2026-04-23 09:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:37:45.839516451 +0000 UTC m=+423.994611420" watchObservedRunningTime="2026-04-23 09:37:45.841151822 +0000 UTC m=+423.996246784"
Apr 23 09:37:55.233528 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:55.233434 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:55.233528 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:55.233482 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:55.239617 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:55.239577 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:55.860905 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:55.860872 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c67fd88bb-p2hzp"
Apr 23 09:37:55.903742 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:37:55.903715 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:38:20.924950 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:20.924910 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b4696bc44-n22hx" podUID="c5fa65b7-aa69-4190-842b-296102192184" containerName="console" containerID="cri-o://b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707" gracePeriod=15
Apr 23 09:38:21.165682 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.165659 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b4696bc44-n22hx_c5fa65b7-aa69-4190-842b-296102192184/console/0.log"
Apr 23 09:38:21.165815 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.165718 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:38:21.258009 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.257980 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258034 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258062 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258112 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258170 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258157 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258343 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258191 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258343 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258226 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh22b\" (UniqueName: \"kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b\") pod \"c5fa65b7-aa69-4190-842b-296102192184\" (UID: \"c5fa65b7-aa69-4190-842b-296102192184\") "
Apr 23 09:38:21.258605 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258580 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:38:21.258605 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258590 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:38:21.258724 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258603 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca" (OuterVolumeSpecName: "service-ca") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:38:21.258841 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.258819 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config" (OuterVolumeSpecName: "console-config") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 09:38:21.260573 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.260540 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:38:21.260671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.260580 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 09:38:21.260671 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.260593 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b" (OuterVolumeSpecName: "kube-api-access-jh22b") pod "c5fa65b7-aa69-4190-842b-296102192184" (UID: "c5fa65b7-aa69-4190-842b-296102192184"). InnerVolumeSpecName "kube-api-access-jh22b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:38:21.359639 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359604 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-trusted-ca-bundle\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359639 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359633 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-oauth-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359639 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359644 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-service-ca\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359942 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359653 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fa65b7-aa69-4190-842b-296102192184-console-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359942 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359664 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-oauth-config\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359942 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359673 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fa65b7-aa69-4190-842b-296102192184-console-serving-cert\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.359942 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.359682 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jh22b\" (UniqueName: \"kubernetes.io/projected/c5fa65b7-aa69-4190-842b-296102192184-kube-api-access-jh22b\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:38:21.939310 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b4696bc44-n22hx_c5fa65b7-aa69-4190-842b-296102192184/console/0.log"
Apr 23 09:38:21.939708 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939324 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5fa65b7-aa69-4190-842b-296102192184" containerID="b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707" exitCode=2
Apr 23 09:38:21.939708 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939396 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b4696bc44-n22hx"
Apr 23 09:38:21.939708 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4696bc44-n22hx" event={"ID":"c5fa65b7-aa69-4190-842b-296102192184","Type":"ContainerDied","Data":"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"}
Apr 23 09:38:21.939708 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4696bc44-n22hx" event={"ID":"c5fa65b7-aa69-4190-842b-296102192184","Type":"ContainerDied","Data":"fbc0d883768ae4fe31219cc6e17b9ef37140698680d32c7b50ef0c050670d137"}
Apr 23 09:38:21.939708 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.939526 2577 scope.go:117] "RemoveContainer" containerID="b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"
Apr 23 09:38:21.947833 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.947812 2577 scope.go:117] "RemoveContainer" containerID="b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"
Apr 23 09:38:21.948066 ip-10-0-140-162 kubenswrapper[2577]: E0423 09:38:21.948047 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707\": container with ID starting with b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707 not found: ID does not exist" containerID="b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"
Apr 23 09:38:21.948111 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.948075 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707"} err="failed to get container status \"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707\": rpc error: code = NotFound desc = could not find container \"b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707\": container with ID starting with b73731e06698bc7e6de8b0df20c3a5e519ce240164cdf735ca733754f5f18707 not found: ID does not exist"
Apr 23 09:38:21.960169 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.960140 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:38:21.965837 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:21.965808 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b4696bc44-n22hx"]
Apr 23 09:38:22.408932 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:38:22.408899 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fa65b7-aa69-4190-842b-296102192184" path="/var/lib/kubelet/pods/c5fa65b7-aa69-4190-842b-296102192184/volumes"
Apr 23 09:40:42.341207 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:40:42.341171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log"
Apr 23 09:40:42.342080 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:40:42.342056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log"
Apr 23 09:41:25.035280 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.035246 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"]
Apr 23 09:41:25.037549 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.035554 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5fa65b7-aa69-4190-842b-296102192184" containerName="console"
Apr 23 09:41:25.037549 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.035565 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fa65b7-aa69-4190-842b-296102192184" containerName="console"
Apr 23 09:41:25.037549 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.035615 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5fa65b7-aa69-4190-842b-296102192184" containerName="console"
Apr 23 09:41:25.038360 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.038345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:41:25.040818 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.040769 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\""
Apr 23 09:41:25.041047 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.041019 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\""
Apr 23 09:41:25.041047 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.041033 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\""
Apr 23 09:41:25.047109 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.047075 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"]
Apr 23 09:41:25.162035 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.161990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gq7\" (UniqueName: \"kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7\") pod \"progression-job-failure-node-0-0-4nsp7\" (UID: \"fd82f73d-230d-4720-abf1-b6c9216e38e1\") " pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:41:25.263493 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.263453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82gq7\" (UniqueName: \"kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7\") pod \"progression-job-failure-node-0-0-4nsp7\" (UID: \"fd82f73d-230d-4720-abf1-b6c9216e38e1\") " pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:41:25.271588 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.271558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82gq7\" (UniqueName: \"kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7\") pod \"progression-job-failure-node-0-0-4nsp7\" (UID: \"fd82f73d-230d-4720-abf1-b6c9216e38e1\") " pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:41:25.348213 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.348125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:41:25.470562 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.470472 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"]
Apr 23 09:41:25.473403 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:41:25.473377 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd82f73d_230d_4720_abf1_b6c9216e38e1.slice/crio-8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1 WatchSource:0}: Error finding container 8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1: Status 404 returned error can't find the container with id 8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1
Apr 23 09:41:25.475805 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.475789 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 09:41:25.524983 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:41:25.524953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" event={"ID":"fd82f73d-230d-4720-abf1-b6c9216e38e1","Type":"ContainerStarted","Data":"8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1"}
Apr 23 09:43:11.908631 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:11.908585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" event={"ID":"fd82f73d-230d-4720-abf1-b6c9216e38e1","Type":"ContainerStarted","Data":"9200c2b53adfba3f6a6368ed5aaf8c4c73b673410a3d6535f33a68affd00ecf8"}
Apr 23 09:43:11.909049 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:11.908689 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:43:11.926659 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:11.926606 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" podStartSLOduration=2.600325673 podStartE2EDuration="1m47.92659152s" podCreationTimestamp="2026-04-23 09:41:24 +0000 UTC" firstStartedPulling="2026-04-23 09:41:25.475914566 +0000 UTC m=+643.631009513" lastFinishedPulling="2026-04-23 09:43:10.802180412 +0000 UTC m=+748.957275360" observedRunningTime="2026-04-23 09:43:11.924559745 +0000 UTC m=+750.079654714" watchObservedRunningTime="2026-04-23 09:43:11.92659152 +0000 UTC m=+750.081686489"
Apr 23 09:43:12.911635 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:12.911605 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:43:20.909321 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:20.909271 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" podUID="fd82f73d-230d-4720-abf1-b6c9216e38e1" containerName="node" probeResult="failure" output="Get \"http://10.133.0.29:28080/metrics\": dial tcp 10.133.0.29:28080: connect: connection refused"
Apr 23 09:43:20.937621 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:20.937591 2577 generic.go:358] "Generic (PLEG): container finished" podID="fd82f73d-230d-4720-abf1-b6c9216e38e1" containerID="9200c2b53adfba3f6a6368ed5aaf8c4c73b673410a3d6535f33a68affd00ecf8" exitCode=1
Apr 23 09:43:20.937760 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:20.937662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" event={"ID":"fd82f73d-230d-4720-abf1-b6c9216e38e1","Type":"ContainerDied","Data":"9200c2b53adfba3f6a6368ed5aaf8c4c73b673410a3d6535f33a68affd00ecf8"}
Apr 23 09:43:22.061220 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.061199 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:43:22.108046 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.108018 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82gq7\" (UniqueName: \"kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7\") pod \"fd82f73d-230d-4720-abf1-b6c9216e38e1\" (UID: \"fd82f73d-230d-4720-abf1-b6c9216e38e1\") "
Apr 23 09:43:22.110337 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.110311 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7" (OuterVolumeSpecName: "kube-api-access-82gq7") pod "fd82f73d-230d-4720-abf1-b6c9216e38e1" (UID: "fd82f73d-230d-4720-abf1-b6c9216e38e1"). InnerVolumeSpecName "kube-api-access-82gq7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:43:22.209599 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.209518 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82gq7\" (UniqueName: \"kubernetes.io/projected/fd82f73d-230d-4720-abf1-b6c9216e38e1-kube-api-access-82gq7\") on node \"ip-10-0-140-162.ec2.internal\" DevicePath \"\""
Apr 23 09:43:22.945686 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.945642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7" event={"ID":"fd82f73d-230d-4720-abf1-b6c9216e38e1","Type":"ContainerDied","Data":"8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1"}
Apr 23 09:43:22.945686 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.945687 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f45bcb87f408d32aef47a97f4b8725fabcf6f2ccdbf81d1d90bf744d641fed1"
Apr 23 09:43:22.945686 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:22.945689 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"
Apr 23 09:43:42.280605 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:42.280570 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"]
Apr 23 09:43:42.282873 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:42.282850 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-job-failure-node-0-0-4nsp7"]
Apr 23 09:43:42.408419 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:43:42.408387 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd82f73d-230d-4720-abf1-b6c9216e38e1" path="/var/lib/kubelet/pods/fd82f73d-230d-4720-abf1-b6c9216e38e1/volumes"
Apr 23 09:44:34.305968 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:34.305937 2577 ???:1] "http: TLS handshake error from 10.0.136.17:52040: EOF"
Apr 23 09:44:34.309695 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:34.309645 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6mvdm_7cbf79e1-2f10-446e-b5c9-8905a2c257a9/global-pull-secret-syncer/0.log"
Apr 23 09:44:34.387211 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:34.387177 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gkkhl_8cc7fa9d-ac09-438d-a8a9-392735fe315f/konnectivity-agent/0.log"
Apr 23 09:44:34.478233 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:34.478204 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-162.ec2.internal_d85ba369ef6b926856de0defa6be0a9e/haproxy/0.log"
Apr 23 09:44:37.323116 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.323081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/alertmanager/0.log"
Apr 23 09:44:37.347627 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.347599 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/config-reloader/0.log"
Apr 23 09:44:37.372875 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.372848 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/kube-rbac-proxy-web/0.log"
Apr 23 09:44:37.404400 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.404372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/kube-rbac-proxy/0.log"
Apr 23 09:44:37.432247 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.432218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/kube-rbac-proxy-metric/0.log"
Apr 23 09:44:37.456574 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.456553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/prom-label-proxy/0.log"
Apr 23 09:44:37.478067 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.478044 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f9deca1-8e6e-4a72-8f05-18ee01034792/init-config-reloader/0.log"
Apr 23 09:44:37.620081 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.619995 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-55c9c986d8-bsgq5_3dd24c38-657c-425c-a27a-09e527b26e51/metrics-server/0.log"
Apr 23 09:44:37.674225 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.674192 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88zx9_31d81257-086f-4891-af00-db82e4feba37/node-exporter/0.log"
Apr 23 09:44:37.695390 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.695359 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88zx9_31d81257-086f-4891-af00-db82e4feba37/kube-rbac-proxy/0.log"
Apr 23 09:44:37.714125 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:37.714108 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88zx9_31d81257-086f-4891-af00-db82e4feba37/init-textfile/0.log"
Apr 23 09:44:38.146975 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.146916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gcq96_60052d67-8854-44ac-9a6b-d9c4acc92c52/prometheus-operator/0.log"
Apr 23 09:44:38.164907 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.164883 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gcq96_60052d67-8854-44ac-9a6b-d9c4acc92c52/kube-rbac-proxy/0.log"
Apr 23 09:44:38.190628 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.190602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-wvdnl_f5d15c32-85eb-45ba-8ab7-d34f9c152d7c/prometheus-operator-admission-webhook/0.log"
Apr 23 09:44:38.216435 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.216409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-695f97b69f-4zff8_0a099632-59e6-471e-9564-387d2c572c9f/telemeter-client/0.log"
Apr 23 09:44:38.237659 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.237617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-695f97b69f-4zff8_0a099632-59e6-471e-9564-387d2c572c9f/reload/0.log"
Apr 23 09:44:38.258173 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:38.258149 2577 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_telemeter-client-695f97b69f-4zff8_0a099632-59e6-471e-9564-387d2c572c9f/kube-rbac-proxy/0.log" Apr 23 09:44:40.205548 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:40.205506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c67fd88bb-p2hzp_ed0821f1-5383-4767-9bab-504e4543445f/console/0.log" Apr 23 09:44:41.113138 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.113109 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v"] Apr 23 09:44:41.113402 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.113391 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd82f73d-230d-4720-abf1-b6c9216e38e1" containerName="node" Apr 23 09:44:41.113457 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.113404 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd82f73d-230d-4720-abf1-b6c9216e38e1" containerName="node" Apr 23 09:44:41.113490 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.113463 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd82f73d-230d-4720-abf1-b6c9216e38e1" containerName="node" Apr 23 09:44:41.116002 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.115986 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.118266 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.118244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"openshift-service-ca.crt\"" Apr 23 09:44:41.118380 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.118253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"kube-root-ca.crt\"" Apr 23 09:44:41.119070 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.119055 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mpzrx\"/\"default-dockercfg-xkkx2\"" Apr 23 09:44:41.126818 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.126796 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v"] Apr 23 09:44:41.242288 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.242250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-sys\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.242288 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.242294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-podres\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.242718 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.242319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-lib-modules\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.242718 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.242353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bws\" (UniqueName: \"kubernetes.io/projected/1cb898fe-82bf-4009-b47e-fdc503608af8-kube-api-access-66bws\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.242718 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.242380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-proc\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.248572 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.248534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pw8pn_d38301ed-fb73-4e46-aa00-455f456c3ca3/dns/0.log" Apr 23 09:44:41.266763 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.266743 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pw8pn_d38301ed-fb73-4e46-aa00-455f456c3ca3/kube-rbac-proxy/0.log" Apr 23 09:44:41.327527 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.327502 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-82v2z_4ad34935-1d25-4cee-bc83-179a47166c9e/dns-node-resolver/0.log" Apr 23 09:44:41.343706 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343677 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-sys\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.343853 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-podres\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.343853 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-lib-modules\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.343853 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-sys\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.343853 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66bws\" (UniqueName: \"kubernetes.io/projected/1cb898fe-82bf-4009-b47e-fdc503608af8-kube-api-access-66bws\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" 
Apr 23 09:44:41.344017 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-proc\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.344017 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-podres\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.344017 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-lib-modules\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.344017 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.343965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cb898fe-82bf-4009-b47e-fdc503608af8-proc\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: \"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.351195 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.351171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bws\" (UniqueName: \"kubernetes.io/projected/1cb898fe-82bf-4009-b47e-fdc503608af8-kube-api-access-66bws\") pod \"perf-node-gather-daemonset-5rm8v\" (UID: 
\"1cb898fe-82bf-4009-b47e-fdc503608af8\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.426496 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.426419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:41.545676 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.545598 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v"] Apr 23 09:44:41.547857 ip-10-0-140-162 kubenswrapper[2577]: W0423 09:44:41.547821 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1cb898fe_82bf_4009_b47e_fdc503608af8.slice/crio-6b81d313e98edec544394070ee49df2b961f0ded10e20a572e1f1ddf43e2342f WatchSource:0}: Error finding container 6b81d313e98edec544394070ee49df2b961f0ded10e20a572e1f1ddf43e2342f: Status 404 returned error can't find the container with id 6b81d313e98edec544394070ee49df2b961f0ded10e20a572e1f1ddf43e2342f Apr 23 09:44:41.758354 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:41.758324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bcm2p_9caa8b02-3e8a-47d0-9956-0368d378aa04/node-ca/0.log" Apr 23 09:44:42.205276 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:42.205190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" event={"ID":"1cb898fe-82bf-4009-b47e-fdc503608af8","Type":"ContainerStarted","Data":"3a68c776de92b55eab828edc58104a305f5f2247946566d57410db518e2e0f9e"} Apr 23 09:44:42.205276 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:42.205226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" 
event={"ID":"1cb898fe-82bf-4009-b47e-fdc503608af8","Type":"ContainerStarted","Data":"6b81d313e98edec544394070ee49df2b961f0ded10e20a572e1f1ddf43e2342f"} Apr 23 09:44:42.205276 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:42.205253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:42.220794 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:42.220439 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" podStartSLOduration=1.2204103370000001 podStartE2EDuration="1.220410337s" podCreationTimestamp="2026-04-23 09:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:44:42.219758468 +0000 UTC m=+840.374853436" watchObservedRunningTime="2026-04-23 09:44:42.220410337 +0000 UTC m=+840.375505370" Apr 23 09:44:42.754075 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:42.754039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-scc48_ffda6afc-6275-4e59-9ea6-12d458a70e5e/serve-healthcheck-canary/0.log" Apr 23 09:44:43.226018 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:43.225941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rfmg9_b8da361e-89a3-41c1-9d56-1a4f1d0e6798/kube-rbac-proxy/0.log" Apr 23 09:44:43.245283 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:43.245258 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rfmg9_b8da361e-89a3-41c1-9d56-1a4f1d0e6798/exporter/0.log" Apr 23 09:44:43.265967 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:43.265943 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-rfmg9_b8da361e-89a3-41c1-9d56-1a4f1d0e6798/extractor/0.log" Apr 23 09:44:44.805979 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:44.805942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-7bd6579f68-55rvv_fa872faf-6ad8-4ad4-9501-890af666b892/manager/0.log" Apr 23 09:44:48.217874 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.217846 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-5rm8v" Apr 23 09:44:48.773636 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.773602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/kube-multus-additional-cni-plugins/0.log" Apr 23 09:44:48.793655 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.793630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/egress-router-binary-copy/0.log" Apr 23 09:44:48.812493 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.812472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/cni-plugins/0.log" Apr 23 09:44:48.830709 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.830685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/bond-cni-plugin/0.log" Apr 23 09:44:48.848575 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.848557 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/routeoverride-cni/0.log" Apr 23 09:44:48.869624 ip-10-0-140-162 kubenswrapper[2577]: 
I0423 09:44:48.869602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/whereabouts-cni-bincopy/0.log" Apr 23 09:44:48.887603 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:48.887586 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hktck_9617f384-f79c-4e0b-86a8-23ef327b0e85/whereabouts-cni/0.log" Apr 23 09:44:49.304883 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:49.304852 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-stttf_be1c69a7-597d-45d8-8fb3-502373eacdd2/kube-multus/0.log" Apr 23 09:44:49.411110 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:49.411080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zh888_5b48044a-70f3-4dd3-b5c5-d35b911078ea/network-metrics-daemon/0.log" Apr 23 09:44:49.428359 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:49.428331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zh888_5b48044a-70f3-4dd3-b5c5-d35b911078ea/kube-rbac-proxy/0.log" Apr 23 09:44:50.231418 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.231330 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-controller/0.log" Apr 23 09:44:50.251926 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.251871 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/0.log" Apr 23 09:44:50.258943 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.258920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovn-acl-logging/1.log" Apr 23 09:44:50.275362 ip-10-0-140-162 
kubenswrapper[2577]: I0423 09:44:50.275323 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/kube-rbac-proxy-node/0.log" Apr 23 09:44:50.297754 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.297718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:44:50.315276 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.315250 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/northd/0.log" Apr 23 09:44:50.334084 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.334053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/nbdb/0.log" Apr 23 09:44:50.355845 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.355820 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/sbdb/0.log" Apr 23 09:44:50.515180 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:50.515153 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-54r9n_72da84fe-89a1-4d25-9ab3-667e195d5b53/ovnkube-controller/0.log" Apr 23 09:44:52.123069 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:52.122951 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jrn69_a3241665-735b-450b-a779-0fc671dcc4d0/network-check-target-container/0.log" Apr 23 09:44:52.991117 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:52.991087 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lhfct_5b5b1ef2-e4bf-4e9d-99fd-2cf856085144/iptables-alerter/0.log" Apr 23 
09:44:53.590918 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:53.590894 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-v6fd9_f15f26f5-0cd1-4c82-8351-806c77b25958/tuned/0.log" Apr 23 09:44:56.345536 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:56.345492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-2fqtf_75cd7467-d6ae-4c15-b25f-58105fb85a1b/service-ca-controller/0.log" Apr 23 09:44:56.667012 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:56.666935 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fpwfr_53dfc537-d64d-46ce-8310-4e177cebb95f/csi-driver/0.log" Apr 23 09:44:56.700464 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:56.700411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fpwfr_53dfc537-d64d-46ce-8310-4e177cebb95f/csi-node-driver-registrar/0.log" Apr 23 09:44:56.717886 ip-10-0-140-162 kubenswrapper[2577]: I0423 09:44:56.717856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fpwfr_53dfc537-d64d-46ce-8310-4e177cebb95f/csi-liveness-probe/0.log"