Apr 24 16:36:12.502260 ip-10-0-137-179 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 16:36:12.502271 ip-10-0-137-179 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 16:36:12.502278 ip-10-0-137-179 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 16:36:12.502418 ip-10-0-137-179 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 16:36:22.567062 ip-10-0-137-179 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 16:36:22.567080 ip-10-0-137-179 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0b9647f077784a26b344e49a9b0f60c0 --
Apr 24 16:38:49.653399 ip-10-0-137-179 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:38:50.129197 ip-10-0-137-179 kubenswrapper[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:50.129197 ip-10-0-137-179 kubenswrapper[2559]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:38:50.129197 ip-10-0-137-179 kubenswrapper[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:50.129197 ip-10-0-137-179 kubenswrapper[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:38:50.129197 ip-10-0-137-179 kubenswrapper[2559]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:50.132087 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.131978    2559 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:38:50.136034 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136017    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:50.136034 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136034    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136038    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136041    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136044    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136047    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136050    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136054    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136056    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136059    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136062    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136064    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136067    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136074    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136090    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136093    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136096    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136099    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136102    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136104    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:50.136116 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136107    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136112    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136116    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136119    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136122    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136126    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136129    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136132    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136134    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136137    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136139    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136142    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136145    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136147    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136150    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136153    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136155    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136158    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136160    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:50.136568 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136163    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136165    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136168    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136171    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136173    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136176    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136178    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136182    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136185    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136187    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136190    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136192    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136195    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136198    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136201    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136204    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136207    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136210    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136212    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136215    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:50.137268 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136217    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136220    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136223    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136225    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136228    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136230    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136233    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136235    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136239    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136244    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136248    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136252    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136255    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136258    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136260    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136263    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136266    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136268    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136272    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136274    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:50.137976 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136278    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136281    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136284    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136286    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136289    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136291    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.136295    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137365    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137381    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137386    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137391    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137396    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137404    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137412    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137423    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137429    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137435    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137440    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137445    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137449    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:50.138509 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137454    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137458    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137462    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137467    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137471    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137476    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137485    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137496    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137501    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137506    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137510    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137515    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137520    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137524    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137528    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137533    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137537    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137542    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137546    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:50.138998 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137555    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137560    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137565    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137569    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137574    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137578    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137583    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137587    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137591    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137595    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137599    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137603    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137607    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137616    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137620    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137624    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137627    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137631    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137636    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137642    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:50.139488 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137647    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137654    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137659    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137664    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137668    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137678    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137682    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137686    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137690    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137694    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137698    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137703    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137795    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137818    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137823    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137828    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137833    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137839    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137844    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137848    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:50.139973 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137857    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137871    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137876    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137886    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137890    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137895    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137920    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.137926    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138095    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138101    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138106    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138109    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138113    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138117    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138217    2559 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138228    2559 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138237    2559 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138244    2559 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138255    2559 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138260    2559 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138266    2559 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:38:50.140586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138274    2559 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138277    2559 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138281    2559 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138284    2559 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138289    2559 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138293    2559 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138296    2559 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138299    2559 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138302    2559 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138305    2559 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138308    2559 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138311    2559 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138316    2559 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138319    2559 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138322    2559 flags.go:64] FLAG: --config-dir=""
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138325    2559 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138329    2559 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138332    2559 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138336    2559 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138339    2559 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138342    2559 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138345    2559 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138348    2559 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138351    2559 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138354    2559 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:38:50.141110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138357    2559 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138362    2559 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138365    2559 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138367    2559 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138371    2559 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138374    2559 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138377    2559 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138383    2559 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138386    2559 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138389    2559 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138393    2559 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138396    2559 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138400    2559 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138403    2559 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138407    2559 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138410    2559 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138413    2559 flags.go:64] FLAG:
--eviction-soft-grace-period="" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138416 2559 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138419 2559 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138422 2559 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138425 2559 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138428 2559 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138430 2559 flags.go:64] FLAG: --feature-gates="" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138434 2559 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138437 2559 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:38:50.141758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138440 2559 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138443 2559 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138447 2559 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138449 2559 flags.go:64] FLAG: --help="false" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138452 2559 flags.go:64] FLAG: --hostname-override="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138456 2559 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:38:50.142387 ip-10-0-137-179 
kubenswrapper[2559]: I0424 16:38:50.138459 2559 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138462 2559 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138465 2559 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138468 2559 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138472 2559 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138474 2559 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138477 2559 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138480 2559 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138483 2559 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138486 2559 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138489 2559 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138492 2559 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138496 2559 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138499 2559 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138502 2559 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138506 2559 flags.go:64] FLAG: --lock-file="" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138509 2559 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138512 2559 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:38:50.142387 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138515 2559 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138521 2559 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138524 2559 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138527 2559 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138530 2559 flags.go:64] FLAG: --logging-format="text" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138533 2559 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138537 2559 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138540 2559 flags.go:64] FLAG: --manifest-url="" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138543 2559 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138548 2559 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:38:50.142973 ip-10-0-137-179 
kubenswrapper[2559]: I0424 16:38:50.138552 2559 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138556 2559 flags.go:64] FLAG: --max-pods="110" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138559 2559 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138562 2559 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138565 2559 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138568 2559 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138571 2559 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138574 2559 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138578 2559 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138586 2559 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138589 2559 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138592 2559 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138596 2559 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:38:50.142973 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138599 2559 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138605 2559 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138608 2559 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138612 2559 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138616 2559 flags.go:64] FLAG: --port="10250" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138619 2559 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138622 2559 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-014e9dcd67f778af9" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138626 2559 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138628 2559 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138631 2559 flags.go:64] FLAG: --register-node="true" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138634 2559 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138637 2559 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138641 2559 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138644 2559 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138647 2559 flags.go:64] FLAG: --reserved-cpus="" Apr 24 
16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138650 2559 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138654 2559 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138657 2559 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138660 2559 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138663 2559 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138666 2559 flags.go:64] FLAG: --runonce="false" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138669 2559 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138672 2559 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138675 2559 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138678 2559 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138681 2559 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:38:50.143582 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138684 2559 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138687 2559 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138690 2559 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 
16:38:50.138693 2559 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138696 2559 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138699 2559 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138702 2559 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138705 2559 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138708 2559 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138713 2559 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138719 2559 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138722 2559 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138725 2559 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138729 2559 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138732 2559 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138735 2559 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138738 2559 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138741 2559 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 
16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138744 2559 flags.go:64] FLAG: --v="2" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138748 2559 flags.go:64] FLAG: --version="false" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138752 2559 flags.go:64] FLAG: --vmodule="" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138757 2559 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.138760 2559 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138851 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138854 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:50.144228 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138857 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138860 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138864 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138866 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138869 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138872 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138874 2559 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138877 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138879 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138882 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138884 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138888 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138890 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138893 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138895 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138897 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138900 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138903 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138906 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 
16:38:50.138908 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:50.144840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138910 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138913 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138915 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138918 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138921 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138923 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138926 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138928 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138931 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138933 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138936 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138938 2559 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138941 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138944 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138947 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138953 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138956 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138958 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138961 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138964 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:50.145380 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138966 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138969 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138971 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138974 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:50.145896 ip-10-0-137-179 
kubenswrapper[2559]: W0424 16:38:50.138976 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138979 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138981 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138984 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138987 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138989 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138992 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138994 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.138997 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139000 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139002 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139005 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139008 2559 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139010 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139013 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:50.145896 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139015 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139018 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139021 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139023 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139026 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139028 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139031 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139034 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139037 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139042 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139046 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139049 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139052 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139055 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139057 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139060 2559 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139063 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139065 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139068 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:50.146371 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139071 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139074 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139092 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139098 
2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139103 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.139105 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:50.146874 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.139785 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:50.147366 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.147343 2559 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:38:50.147400 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.147368 2559 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:38:50.147432 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147417 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:50.147432 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147426 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:50.147432 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147429 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:50.147432 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147433 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147436 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147439 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147443 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147446 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147448 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147451 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147454 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147457 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147459 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147462 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147465 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147467 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147470 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147473 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147476 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147479 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147481 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147484 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147487 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:50.147534 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147489 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147491 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147494 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147497 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147500 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147502 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147505 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147509 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147513 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147516 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147519 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147522 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147525 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147528 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147531 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147534 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147537 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147540 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147543 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:50.148015 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147546 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147548 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147551 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147554 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147557 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147559 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147562 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147565 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147567 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147570 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147573 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147575 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147578 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147581 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147583 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147586 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147589 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147592 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147594 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147598 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:50.148492 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147600 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147603 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147606 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147608 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147611 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147613 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147616 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147619 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147622 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147625 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147627 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147630 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147632 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147635 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147637 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147640 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147643 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147645 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147648 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147650 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:50.148983 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147653 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147656 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147658 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147661 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.147666 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147768 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147773 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147776 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147779 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147782 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147786 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147790 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147793 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147796 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147799 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:50.149497 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147802 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147805 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147807 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147810 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147813 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147816 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147819 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147821 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147824 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147826 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147829 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147831 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147834 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147836 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147839 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147841 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147845 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147849 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147852 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147855 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:50.149871 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147857 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147860 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147862 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147865 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147868 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147870 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147873 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147875 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147878 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147880 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147883 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147885 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147888 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147891 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147893 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147896 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147899 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147902 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147904 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147907 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:50.150385 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147909 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147912 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147915 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147917 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147920 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147922 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147925 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147927 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147930 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147932 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147935 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147937 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147940 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147943 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147945 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147947 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147950 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147953 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147955 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147958 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:50.150905 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147960 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147963 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147965 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147968 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147970 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147973 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147976 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147979 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147982 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147984 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147987 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147989 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147992 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147995 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.147997 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:50.151575 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:50.148000 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:50.151979 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.148005 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:38:50.151979 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.148743 2559 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 16:38:50.151979 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.151790 2559 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 16:38:50.152812 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.152800 2559 server.go:1019] "Starting client certificate rotation"
Apr 24 16:38:50.152909 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.152896 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:38:50.152948 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.152930 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:38:50.179227 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.179198 2559 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:38:50.181061 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.181039 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:38:50.199098 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.199065 2559 log.go:25] "Validated CRI v1 runtime API"
Apr 24 16:38:50.205941 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.205921 2559 log.go:25] "Validated CRI v1 image API"
Apr 24 16:38:50.207691 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.207676 2559 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 16:38:50.208975 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.208958 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:38:50.214706 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.214681 2559 fs.go:135] Filesystem UUIDs: map[0f10384b-9505-42a6-a737-a5a1d0ba4fa0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 94a484dd-7ace-4738-b8d4-1755995393e3:/dev/nvme0n1p4]
Apr 24 16:38:50.214780 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.214704 2559 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:38:50.221609 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.221473 2559 manager.go:217] Machine: {Timestamp:2026-04-24 16:38:50.220107265 +0000 UTC m=+0.434831633 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3172041 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec234a881851de2f3a53cff544b78d87 SystemUUID:ec234a88-1851-de2f-3a53-cff544b78d87 BootID:0b9647f0-7778-4a26-b344-e49a9b0f60c0 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4f:59:f4:80:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4f:59:f4:80:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:11:6c:f9:ec:fa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:38:50.221713 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.221617 2559 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:38:50.221759 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.221711 2559 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 16:38:50.222256 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.222232 2559 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 16:38:50.222403 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.222259 2559 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-179.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:38:50.222445 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.222413 2559 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:38:50.222445 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.222423 2559 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:38:50.222445 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.222436 2559 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:50.223260 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.223250 2559 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:50.224688 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.224677 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:50.224796 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.224787 2559 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:38:50.227284 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.227273 2559 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:38:50.227323 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.227288 2559 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:38:50.227323 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.227302 2559 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:38:50.227323 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.227312 2559 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:38:50.227323 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.227321 2559 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 16:38:50.228479 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.228465 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:50.228527 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.228485 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:50.231113 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.231093 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9b298" Apr 24 16:38:50.231617 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.231595 2559 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:38:50.233592 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.233579 2559 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:38:50.234891 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234880 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234898 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234904 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234910 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234916 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 
16:38:50.234922 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234928 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234939 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:38:50.234952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234947 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:38:50.235170 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234958 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:38:50.235170 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234971 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:38:50.235170 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.234979 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:38:50.235889 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.235879 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:38:50.235889 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.235890 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:38:50.239278 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239257 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9b298" Apr 24 16:38:50.239402 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239389 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:38:50.239453 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239426 2559 server.go:1295] "Started kubelet" Apr 24 16:38:50.239607 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239536 2559 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 24 16:38:50.239658 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239587 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:38:50.239767 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.239756 2559 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:38:50.240466 ip-10-0-137-179 systemd[1]: Started Kubernetes Kubelet. Apr 24 16:38:50.241641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.241621 2559 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-179.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:38:50.241733 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.241711 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-179.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:38:50.241777 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.241710 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:38:50.242293 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.242269 2559 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:38:50.242818 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.242802 2559 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:38:50.246473 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.245459 2559 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-179.ec2.internal.18a95867aa7a8302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-179.ec2.internal,UID:ip-10-0-137-179.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-179.ec2.internal,},FirstTimestamp:2026-04-24 16:38:50.239402754 +0000 UTC m=+0.454127121,LastTimestamp:2026-04-24 16:38:50.239402754 +0000 UTC m=+0.454127121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-179.ec2.internal,}" Apr 24 16:38:50.247342 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.247327 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:38:50.247401 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.247358 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:50.248312 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.248291 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.249110 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.248419 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:38:50.249385 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.249360 2559 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:38:50.249385 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.249385 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:38:50.249572 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.249559 2559 
reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:38:50.249620 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.249573 2559 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:38:50.250438 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250421 2559 factory.go:55] Registering systemd factory Apr 24 16:38:50.250518 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250447 2559 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:38:50.250739 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250724 2559 factory.go:153] Registering CRI-O factory Apr 24 16:38:50.250822 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250750 2559 factory.go:223] Registration of the crio container factory successfully Apr 24 16:38:50.250822 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.250808 2559 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:38:50.250941 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250816 2559 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:38:50.250941 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250848 2559 factory.go:103] Registering Raw factory Apr 24 16:38:50.250941 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.250864 2559 manager.go:1196] Started watching for new ooms in manager Apr 24 16:38:50.253050 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.253030 2559 manager.go:319] Starting recovery of all containers Apr 24 16:38:50.260785 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.260658 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:50.263369 
ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.263346 2559 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-179.ec2.internal\" not found" node="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.265586 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.265570 2559 manager.go:324] Recovery completed Apr 24 16:38:50.269668 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.269652 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.273479 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.273457 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.273563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.273499 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.273563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.273515 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.274141 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.274074 2559 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:38:50.274141 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.274139 2559 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:38:50.274242 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.274156 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:50.276861 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.276849 2559 policy_none.go:49] "None policy: Start" Apr 24 16:38:50.276908 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.276864 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:38:50.276908 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.276873 2559 
state_mem.go:35] "Initializing new in-memory state store" Apr 24 16:38:50.330726 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.330708 2559 manager.go:341] "Starting Device Plugin manager" Apr 24 16:38:50.330878 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.330744 2559 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:38:50.330878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.330754 2559 server.go:85] "Starting device plugin registration server" Apr 24 16:38:50.331019 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.331008 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:38:50.331070 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.331021 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:38:50.331132 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.331115 2559 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:38:50.331205 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.331190 2559 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:38:50.331205 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.331202 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:38:50.331750 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.331727 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 16:38:50.331825 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.331769 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.402347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.402241 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:38:50.403472 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.403442 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:38:50.403543 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.403479 2559 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:38:50.403543 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.403511 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:38:50.403543 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.403529 2559 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:38:50.403683 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.403572 2559 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:38:50.407328 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.407307 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:50.431372 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.431346 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.432522 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.432504 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.432604 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.432535 2559 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.432604 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.432545 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.432604 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.432569 2559 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.438757 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.438738 2559 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.438826 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.438765 2559 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-179.ec2.internal\": node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.458756 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.458731 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.504051 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.504008 2559 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal"] Apr 24 16:38:50.504157 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.504111 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.505807 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.505790 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.505866 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.505825 2559 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.505866 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.505839 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.507196 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507183 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.507333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507318 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.507384 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507346 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.507872 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507857 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.507946 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507872 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.507946 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507886 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.507946 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507897 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.507946 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507898 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 
16:38:50.507946 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.507910 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.509153 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.509137 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.509215 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.509176 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.509803 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.509786 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.509976 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.509822 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.509976 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.509836 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.524631 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.524605 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-179.ec2.internal\" not found" node="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.527765 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.527750 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-179.ec2.internal\" not found" node="ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.552147 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.552114 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5aeb36898807b4f178264985b1daa831-config\") pod \"kube-apiserver-proxy-ip-10-0-137-179.ec2.internal\" (UID: \"5aeb36898807b4f178264985b1daa831\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.552293 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.552150 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.552293 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.552176 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.559006 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.558983 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.652794 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.652715 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5aeb36898807b4f178264985b1daa831-config\") pod \"kube-apiserver-proxy-ip-10-0-137-179.ec2.internal\" (UID: \"5aeb36898807b4f178264985b1daa831\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.652794 ip-10-0-137-179 kubenswrapper[2559]: 
I0424 16:38:50.652747 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.652794 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.652774 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.652983 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.652834 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.652983 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.652846 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/944f81ceadc2b89c85f169eae3ca4b09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal\" (UID: \"944f81ceadc2b89c85f169eae3ca4b09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.652983 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.652834 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/5aeb36898807b4f178264985b1daa831-config\") pod \"kube-apiserver-proxy-ip-10-0-137-179.ec2.internal\" (UID: \"5aeb36898807b4f178264985b1daa831\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.659813 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.659793 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.760842 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.760796 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.827048 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.827011 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.830107 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:50.830093 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:50.861522 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.861443 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:50.962059 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:50.961969 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.062586 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.062548 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.153140 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.153104 2559 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 16:38:51.153772 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.153267 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:38:51.153772 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.153290 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:38:51.163434 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.163400 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.242127 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.242030 2559 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:33:50 +0000 UTC" deadline="2027-11-29 12:42:01.343623415 +0000 UTC" Apr 24 16:38:51.242127 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.242064 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14012h3m10.101562157s" Apr 24 16:38:51.247854 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.247837 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:51.262288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.262264 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:51.263977 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.263947 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.283161 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.283124 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4cw9d" Apr 24 16:38:51.291160 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.291135 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4cw9d" Apr 24 16:38:51.364218 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.364187 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.381755 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:51.381716 2559 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f81ceadc2b89c85f169eae3ca4b09.slice/crio-9f8add97ab0ba467e3155a470b46a92e50d0cab9bed3065a52b26b3cac4a71f2 WatchSource:0}: Error finding container 9f8add97ab0ba467e3155a470b46a92e50d0cab9bed3065a52b26b3cac4a71f2: Status 404 returned error can't find the container with id 9f8add97ab0ba467e3155a470b46a92e50d0cab9bed3065a52b26b3cac4a71f2 Apr 24 16:38:51.382179 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:51.382154 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aeb36898807b4f178264985b1daa831.slice/crio-a3aba1430a4f66188d87b920b1fbc978332415ae3a0b52f09d76953fcddd1a81 WatchSource:0}: Error finding container a3aba1430a4f66188d87b920b1fbc978332415ae3a0b52f09d76953fcddd1a81: Status 404 returned error can't find the container with id a3aba1430a4f66188d87b920b1fbc978332415ae3a0b52f09d76953fcddd1a81 Apr 24 16:38:51.385888 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.385860 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:38:51.406406 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.406351 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" event={"ID":"944f81ceadc2b89c85f169eae3ca4b09","Type":"ContainerStarted","Data":"9f8add97ab0ba467e3155a470b46a92e50d0cab9bed3065a52b26b3cac4a71f2"} Apr 24 16:38:51.407165 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.407145 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" event={"ID":"5aeb36898807b4f178264985b1daa831","Type":"ContainerStarted","Data":"a3aba1430a4f66188d87b920b1fbc978332415ae3a0b52f09d76953fcddd1a81"} Apr 24 16:38:51.464689 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.464657 2559 kubelet_node_status.go:515] "Error 
getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.565327 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.565245 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.665782 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.665752 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.766730 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:51.766694 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-179.ec2.internal\" not found" Apr 24 16:38:51.778809 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.778633 2559 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.789785 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.789754 2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.848323 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.848240 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" Apr 24 16:38:51.861229 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.861188 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:51.862207 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.862185 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" Apr 24 16:38:51.872728 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.872707 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which 
can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:51.971910 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.971731 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.980233 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:51.980208 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:52.228842 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.228763 2559 apiserver.go:52] "Watching apiserver" Apr 24 16:38:52.234786 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.234763 2559 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:38:52.235200 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.235177 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-47g4c","openshift-multus/multus-additional-cni-plugins-8gmjx","openshift-multus/network-metrics-daemon-xlwrd","kube-system/konnectivity-agent-86w64","kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal","openshift-cluster-node-tuning-operator/tuned-lkl8j","openshift-image-registry/node-ca-v6j2z","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal","openshift-network-diagnostics/network-check-target-kddtl","openshift-network-operator/iptables-alerter-fg8mz","openshift-ovn-kubernetes/ovnkube-node-7f9px","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9","openshift-dns/node-resolver-7t5b9"] Apr 24 16:38:52.238343 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.238319 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v6j2z" Apr 24 16:38:52.240551 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.240524 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.240982 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.240959 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:38:52.241099 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.240968 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.241099 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.240964 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.241099 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.241007 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lqszl\"" Apr 24 16:38:52.242708 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.242688 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:52.242807 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.242770 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:38:52.242895 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.242871 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:38:52.243072 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.243049 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:38:52.243169 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.243121 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.243450 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.243423 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gc979\"" Apr 24 16:38:52.243548 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.243493 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.244811 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.244794 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-86w64" Apr 24 16:38:52.245219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.245195 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:38:52.247011 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.246991 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.247336 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.247321 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8md7w\"" Apr 24 16:38:52.247423 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.247350 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:38:52.247742 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.247724 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:38:52.249065 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.249047 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.250855 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.250165 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.250855 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.250529 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tgjhv\"" Apr 24 16:38:52.252344 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.251355 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.252344 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.252198 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-p7chn\"" Apr 24 16:38:52.252494 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.252443 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:38:52.253134 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.252701 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:52.253134 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.252803 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:38:52.255346 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.255323 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fg8mz" Apr 24 16:38:52.257783 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.257762 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.257783 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.257780 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:38:52.257982 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.257961 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.258063 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.257985 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jgsvz\"" Apr 24 16:38:52.258063 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.258026 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.260325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.260306 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.261024 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261007 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:38:52.261210 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261193 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dvkg8\"" Apr 24 16:38:52.261407 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261387 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:38:52.261510 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261429 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.261647 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261626 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac5de78b-4789-4f34-988c-b131198ef66b-konnectivity-ca\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64" Apr 24 16:38:52.261647 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261641 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261659 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-systemd\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261685 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261711 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48xg\" (UniqueName: \"kubernetes.io/projected/9a75b238-62f3-4139-9303-b235113baa9d-kube-api-access-n48xg\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261733 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-system-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261754 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-multus\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.261788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261778 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgl2\" 
(UniqueName: \"kubernetes.io/projected/d5140d31-beb5-42da-bdf5-60b7bfc41f79-kube-api-access-xbgl2\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261795 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7282f227-12bf-4645-b003-1131e4895ca0-host\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261812 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-os-release\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261828 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261843 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.262175 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261862 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-sys\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261877 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-lib-modules\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261889 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.261892 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-var-lib-kubelet\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262023 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9c7\" (UniqueName: \"kubernetes.io/projected/513c5343-09a1-48a6-8da2-774588de4d58-kube-api-access-sh9c7\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262050 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cnibin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262097 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-daemon-config\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262122 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262147 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-conf\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262172 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-etc-tuned\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.262902 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262193 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-os-release\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262205 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262217 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-k8s-cni-cncf-io\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262256 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-kubelet\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262321 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-hostroot\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262359 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-etc-kubernetes\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262386 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrvn\" (UniqueName: \"kubernetes.io/projected/af7a310a-32d2-48cf-b2d6-69c07daae4b0-kube-api-access-hzrvn\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262407 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysconfig\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262450 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-kubernetes\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262474 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-tmp\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262508 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-socket-dir-parent\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262545 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-netns\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262556 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cgfwf\""
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262578 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-bin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262608 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262630 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262647 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-host\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.262902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262663 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cnibin\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262685 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-modprobe-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262705 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262728 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cni-binary-copy\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262750 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n9sh\" (UniqueName: \"kubernetes.io/projected/7282f227-12bf-4645-b003-1131e4895ca0-kube-api-access-6n9sh\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262790 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-run\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262817 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262862 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-conf-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262887 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262887 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-multus-certs\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262920 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7282f227-12bf-4645-b003-1131e4895ca0-serviceca\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262947 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac5de78b-4789-4f34-988c-b131198ef66b-agent-certs\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262896 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.262892 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 16:38:52.263641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.263239 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 16:38:52.265211 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.265189 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 16:38:52.265438 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.265371 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mpw6l\""
Apr 24 16:38:52.265438 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.265412 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 16:38:52.291784 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.291751 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:51 +0000 UTC" deadline="2027-12-05 05:37:55.314639907 +0000 UTC"
Apr 24 16:38:52.291784 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.291783 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14148h59m3.022860452s"
Apr 24 16:38:52.349280 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.349251 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 16:38:52.363496 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363465 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-netns\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.363656 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363501 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d355f77-a36f-48e8-a168-c520805efc91-ovn-node-metrics-cert\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.363656 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363523 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-registration-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.363656 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363552 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-host\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.363656 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363629 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-host\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.363656 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363641 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-systemd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363678 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-ovn\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363708 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-modprobe-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363784 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363822 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eec44da0-010a-4044-830b-8f9776a91747-hosts-file\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363839 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-modprobe-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363846 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-run\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363880 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-conf-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.363897 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363889 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-run\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363913 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-multus-certs\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363956 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-conf-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.363967 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7282f227-12bf-4645-b003-1131e4895ca0-serviceca\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364027 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-multus-certs\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364096 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac5de78b-4789-4f34-988c-b131198ef66b-konnectivity-ca\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364141 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n48xg\" (UniqueName: \"kubernetes.io/projected/9a75b238-62f3-4139-9303-b235113baa9d-kube-api-access-n48xg\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364173 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-log-socket\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364199 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-system-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364224 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgl2\" (UniqueName: \"kubernetes.io/projected/d5140d31-beb5-42da-bdf5-60b7bfc41f79-kube-api-access-xbgl2\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364250 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-os-release\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364276 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364304 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364332 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-script-lib\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364356 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364382 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364409 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-sys\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364434 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-lib-modules\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364458 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-var-lib-kubelet\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364483 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-node-log\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364516 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7282f227-12bf-4645-b003-1131e4895ca0-serviceca\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.364555 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364526 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364560 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-bin\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364593 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-sys-fs\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364599 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac5de78b-4789-4f34-988c-b131198ef66b-konnectivity-ca\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364628 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-os-release\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364689 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-sys\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364707 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-var-lib-kubelet\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364745 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-os-release\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364789 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-os-release\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364845 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-kubelet\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364873 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-system-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364881 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364905 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-lib-modules\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364910 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysconfig\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364938 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-tmp\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364939 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-kubelet\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364980 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-socket-dir-parent\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.364997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.364978 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysconfig\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365008 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-bin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365039 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-etc-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365044 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-socket-dir-parent\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365070 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-bin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365074 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfbf\" (UniqueName: \"kubernetes.io/projected/bce0b728-ae34-46c9-af1b-d6af180d5c23-kube-api-access-fpfbf\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365117 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72n4\" (UniqueName: \"kubernetes.io/projected/3cf6c854-7267-4e50-9e1d-4fdd84304910-kube-api-access-r72n4\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365137 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cnibin\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365142 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365156 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-socket-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365171 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eec44da0-010a-4044-830b-8f9776a91747-tmp-dir\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365198 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cnibin\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365209 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365243 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cni-binary-copy\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365242 2559 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365274 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9sh\" (UniqueName: \"kubernetes.io/projected/7282f227-12bf-4645-b003-1131e4895ca0-kube-api-access-6n9sh\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365305 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-systemd-units\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.365661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365330 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-netd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365332 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-d\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365367 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4bm\" (UniqueName: \"kubernetes.io/projected/4d355f77-a36f-48e8-a168-c520805efc91-kube-api-access-8w4bm\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365395 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6m45\" (UniqueName: \"kubernetes.io/projected/eec44da0-010a-4044-830b-8f9776a91747-kube-api-access-p6m45\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365442 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365466 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-device-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID:
\"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365506 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac5de78b-4789-4f34-988c-b131198ef66b-agent-certs\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365540 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-cni-dir\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365571 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365580 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-systemd\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365680 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365712 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365721 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-systemd\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365735 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-var-lib-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365766 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-config\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365788 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3cf6c854-7267-4e50-9e1d-4fdd84304910-host-slash\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365793 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cni-binary-copy\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.366474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365812 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-multus\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365844 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7282f227-12bf-4645-b003-1131e4895ca0-host\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365856 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-var-lib-cni-multus\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365882 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-slash\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365891 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7282f227-12bf-4645-b003-1131e4895ca0-host\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365911 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3cf6c854-7267-4e50-9e1d-4fdd84304910-iptables-alerter-script\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365940 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9c7\" (UniqueName: \"kubernetes.io/projected/513c5343-09a1-48a6-8da2-774588de4d58-kube-api-access-sh9c7\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365967 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cnibin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.365993 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-daemon-config\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366018 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366045 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-env-overrides\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366050 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-cnibin\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366070 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-conf\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 
16:38:52.366113 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-etc-tuned\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366138 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-k8s-cni-cncf-io\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366162 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-hostroot\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366185 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-etc-kubernetes\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.367314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366200 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/af7a310a-32d2-48cf-b2d6-69c07daae4b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366211 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrvn\" (UniqueName: \"kubernetes.io/projected/af7a310a-32d2-48cf-b2d6-69c07daae4b0-kube-api-access-hzrvn\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366237 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-kubelet\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366263 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-kubernetes\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366265 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-hostroot\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366115 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.368104 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366288 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-netns\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366309 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366319 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-k8s-cni-cncf-io\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366325 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.366390 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366422 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-kubernetes\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366468 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-etc-kubernetes\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.366483 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:38:52.866426026 +0000 UTC m=+3.081150387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366618 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af7a310a-32d2-48cf-b2d6-69c07daae4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366666 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5140d31-beb5-42da-bdf5-60b7bfc41f79-host-run-netns\") pod \"multus-47g4c\" 
(UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.366715 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/513c5343-09a1-48a6-8da2-774588de4d58-etc-sysctl-conf\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.368104 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.367289 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5140d31-beb5-42da-bdf5-60b7bfc41f79-multus-daemon-config\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.368841 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.368825 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-tmp\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.369119 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.369101 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac5de78b-4789-4f34-988c-b131198ef66b-agent-certs\") pod \"konnectivity-agent-86w64\" (UID: \"ac5de78b-4789-4f34-988c-b131198ef66b\") " pod="kube-system/konnectivity-agent-86w64" Apr 24 16:38:52.369234 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.369151 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/513c5343-09a1-48a6-8da2-774588de4d58-etc-tuned\") pod \"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.378075 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.378047 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgl2\" (UniqueName: \"kubernetes.io/projected/d5140d31-beb5-42da-bdf5-60b7bfc41f79-kube-api-access-xbgl2\") pod \"multus-47g4c\" (UID: \"d5140d31-beb5-42da-bdf5-60b7bfc41f79\") " pod="openshift-multus/multus-47g4c" Apr 24 16:38:52.378711 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.378687 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48xg\" (UniqueName: \"kubernetes.io/projected/9a75b238-62f3-4139-9303-b235113baa9d-kube-api-access-n48xg\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:52.378835 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.378818 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9sh\" (UniqueName: \"kubernetes.io/projected/7282f227-12bf-4645-b003-1131e4895ca0-kube-api-access-6n9sh\") pod \"node-ca-v6j2z\" (UID: \"7282f227-12bf-4645-b003-1131e4895ca0\") " pod="openshift-image-registry/node-ca-v6j2z" Apr 24 16:38:52.378961 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.378934 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrvn\" (UniqueName: \"kubernetes.io/projected/af7a310a-32d2-48cf-b2d6-69c07daae4b0-kube-api-access-hzrvn\") pod \"multus-additional-cni-plugins-8gmjx\" (UID: \"af7a310a-32d2-48cf-b2d6-69c07daae4b0\") " pod="openshift-multus/multus-additional-cni-plugins-8gmjx" Apr 24 16:38:52.379176 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.379160 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9c7\" (UniqueName: \"kubernetes.io/projected/513c5343-09a1-48a6-8da2-774588de4d58-kube-api-access-sh9c7\") pod 
\"tuned-lkl8j\" (UID: \"513c5343-09a1-48a6-8da2-774588de4d58\") " pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" Apr 24 16:38:52.467091 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467044 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467100 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-bin\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467120 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-sys-fs\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467157 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467162 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467179 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-etc-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467162 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-bin\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467205 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfbf\" (UniqueName: \"kubernetes.io/projected/bce0b728-ae34-46c9-af1b-d6af180d5c23-kube-api-access-fpfbf\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467220 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-sys-fs\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467233 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r72n4\" (UniqueName: \"kubernetes.io/projected/3cf6c854-7267-4e50-9e1d-4fdd84304910-kube-api-access-r72n4\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467243 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467260 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-socket-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" Apr 24 16:38:52.467275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467270 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-etc-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467280 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eec44da0-010a-4044-830b-8f9776a91747-tmp-dir\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9" Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467335 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-systemd-units\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467362 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-netd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467388 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4bm\" (UniqueName: \"kubernetes.io/projected/4d355f77-a36f-48e8-a168-c520805efc91-kube-api-access-8w4bm\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467412 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6m45\" (UniqueName: \"kubernetes.io/projected/eec44da0-010a-4044-830b-8f9776a91747-kube-api-access-p6m45\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467423 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-systemd-units\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467431 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-socket-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467451 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-cni-netd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467552 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-device-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467567 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eec44da0-010a-4044-830b-8f9776a91747-tmp-dir\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467597 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467613 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-var-lib-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467623 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-device-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467630 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-config\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467655 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3cf6c854-7267-4e50-9e1d-4fdd84304910-host-slash\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467699 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-var-lib-openvswitch\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.467863 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467701 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3cf6c854-7267-4e50-9e1d-4fdd84304910-host-slash\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467747 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-slash\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467766 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3cf6c854-7267-4e50-9e1d-4fdd84304910-iptables-alerter-script\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467790 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-env-overrides\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467813 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-slash\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467823 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-kubelet\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467856 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-kubelet\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467868 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-netns\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467892 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d355f77-a36f-48e8-a168-c520805efc91-ovn-node-metrics-cert\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467908 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-registration-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467929 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-systemd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467928 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-run-netns\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.467985 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-systemd\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468052 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-ovn\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468065 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-registration-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468117 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-run-ovn\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468114 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.468640 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468151 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468154 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eec44da0-010a-4044-830b-8f9776a91747-hosts-file\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468200 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-log-socket\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468214 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-config\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468233 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-script-lib\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468254 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-env-overrides\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468257 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-log-socket\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468257 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eec44da0-010a-4044-830b-8f9776a91747-hosts-file\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468278 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468310 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468340 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-node-log\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468341 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3cf6c854-7267-4e50-9e1d-4fdd84304910-iptables-alerter-script\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468353 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468398 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d355f77-a36f-48e8-a168-c520805efc91-node-log\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468455 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bce0b728-ae34-46c9-af1b-d6af180d5c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.469218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.468736 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d355f77-a36f-48e8-a168-c520805efc91-ovnkube-script-lib\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.470374 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.470355 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d355f77-a36f-48e8-a168-c520805efc91-ovn-node-metrics-cert\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.480066 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.479990 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:52.480066 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.480020 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:52.480066 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.480036 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:52.480327 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.480212 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. No retries permitted until 2026-04-24 16:38:52.980191155 +0000 UTC m=+3.194915535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:52.481704 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.481678 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72n4\" (UniqueName: \"kubernetes.io/projected/3cf6c854-7267-4e50-9e1d-4fdd84304910-kube-api-access-r72n4\") pod \"iptables-alerter-fg8mz\" (UID: \"3cf6c854-7267-4e50-9e1d-4fdd84304910\") " pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.482573 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.482554 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfbf\" (UniqueName: \"kubernetes.io/projected/bce0b728-ae34-46c9-af1b-d6af180d5c23-kube-api-access-fpfbf\") pod \"aws-ebs-csi-driver-node-r98c9\" (UID: \"bce0b728-ae34-46c9-af1b-d6af180d5c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.482854 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.482824 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6m45\" (UniqueName: \"kubernetes.io/projected/eec44da0-010a-4044-830b-8f9776a91747-kube-api-access-p6m45\") pod \"node-resolver-7t5b9\" (UID: \"eec44da0-010a-4044-830b-8f9776a91747\") " pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.483057 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.483039 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4bm\" (UniqueName: \"kubernetes.io/projected/4d355f77-a36f-48e8-a168-c520805efc91-kube-api-access-8w4bm\") pod \"ovnkube-node-7f9px\" (UID: \"4d355f77-a36f-48e8-a168-c520805efc91\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.555273 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.555225 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v6j2z"
Apr 24 16:38:52.563063 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.563036 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8gmjx"
Apr 24 16:38:52.570834 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.570813 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:38:52.575428 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.575401 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lkl8j"
Apr 24 16:38:52.581952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.581933 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47g4c"
Apr 24 16:38:52.589421 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.589403 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fg8mz"
Apr 24 16:38:52.595045 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.595028 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:38:52.603614 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.603593 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9"
Apr 24 16:38:52.610222 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.609150 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7t5b9"
Apr 24 16:38:52.870928 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:52.870835 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:38:52.871102 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.870971 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:52.871102 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:52.871074 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:38:53.871024571 +0000 UTC m=+4.085748944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:52.989719 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.989690 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5de78b_4789_4f34_988c_b131198ef66b.slice/crio-291ad0353675da6f254cc7e0ac4f46ea28627ee06b55399393e287da3c6cc39b WatchSource:0}: Error finding container 291ad0353675da6f254cc7e0ac4f46ea28627ee06b55399393e287da3c6cc39b: Status 404 returned error can't find the container with id 291ad0353675da6f254cc7e0ac4f46ea28627ee06b55399393e287da3c6cc39b
Apr 24 16:38:52.992711 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.992683 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce0b728_ae34_46c9_af1b_d6af180d5c23.slice/crio-f533c569d2cbd224fabdcc0b29e780302ed97a777fe4e08471a4dd18961610ae WatchSource:0}: Error finding container f533c569d2cbd224fabdcc0b29e780302ed97a777fe4e08471a4dd18961610ae: Status 404 returned error can't find the container with id f533c569d2cbd224fabdcc0b29e780302ed97a777fe4e08471a4dd18961610ae
Apr 24 16:38:52.994214 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.994185 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf7a310a_32d2_48cf_b2d6_69c07daae4b0.slice/crio-2fe4d07721cacfde1e9591c0523d6d82756a474a0a5bf078290103500e13a2f7 WatchSource:0}: Error finding container 2fe4d07721cacfde1e9591c0523d6d82756a474a0a5bf078290103500e13a2f7: Status 404 returned error can't find the container with id 2fe4d07721cacfde1e9591c0523d6d82756a474a0a5bf078290103500e13a2f7
Apr 24 16:38:52.995014 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.994988 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec44da0_010a_4044_830b_8f9776a91747.slice/crio-5153b60b7721c55978ae57d62e8db974eba2c2db81be0b6b293617bc594ef059 WatchSource:0}: Error finding container 5153b60b7721c55978ae57d62e8db974eba2c2db81be0b6b293617bc594ef059: Status 404 returned error can't find the container with id 5153b60b7721c55978ae57d62e8db974eba2c2db81be0b6b293617bc594ef059
Apr 24 16:38:52.995581 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.995546 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf6c854_7267_4e50_9e1d_4fdd84304910.slice/crio-20a6e73e2e88877a3acd93c29a336793e80a47ee9b9c2b1ea12c92fe65c49b15 WatchSource:0}: Error finding container 20a6e73e2e88877a3acd93c29a336793e80a47ee9b9c2b1ea12c92fe65c49b15: Status 404 returned error can't find the container with id 20a6e73e2e88877a3acd93c29a336793e80a47ee9b9c2b1ea12c92fe65c49b15
Apr 24 16:38:52.997140 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.997042 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513c5343_09a1_48a6_8da2_774588de4d58.slice/crio-ef2ddfe2f32e09754064dcf74aebf501eb411fdfeaf2ec4ed1924cba5e85d999 WatchSource:0}: Error finding container ef2ddfe2f32e09754064dcf74aebf501eb411fdfeaf2ec4ed1924cba5e85d999: Status 404 returned error can't find the container with id ef2ddfe2f32e09754064dcf74aebf501eb411fdfeaf2ec4ed1924cba5e85d999
Apr 24 16:38:52.997733 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.997693 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7282f227_12bf_4645_b003_1131e4895ca0.slice/crio-2607f06c0bcc010773f08405a5c2ac927c8e352d8bf5e73d0ac19a6383c1cc82 WatchSource:0}: Error finding container 2607f06c0bcc010773f08405a5c2ac927c8e352d8bf5e73d0ac19a6383c1cc82: Status 404 returned error can't find the container with id 2607f06c0bcc010773f08405a5c2ac927c8e352d8bf5e73d0ac19a6383c1cc82
Apr 24 16:38:52.999510 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:38:52.998998 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5140d31_beb5_42da_bdf5_60b7bfc41f79.slice/crio-9bc968038fc46a10c55305b8321852914778684180d1655070f5b3459815d955 WatchSource:0}: Error finding container 9bc968038fc46a10c55305b8321852914778684180d1655070f5b3459815d955: Status 404 returned error can't find the container with id 9bc968038fc46a10c55305b8321852914778684180d1655070f5b3459815d955
Apr 24 16:38:53.071967 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.071778 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:38:53.072062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.071923 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:53.072062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.072053 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:53.072168 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.072066 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:53.072168 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.072135 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. No retries permitted until 2026-04-24 16:38:54.072120456 +0000 UTC m=+4.286844810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:53.293043 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.292924 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:51 +0000 UTC" deadline="2027-11-25 22:35:07.123317647 +0000 UTC"
Apr 24 16:38:53.293043 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.292955 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13925h56m13.830364962s"
Apr 24 16:38:53.404788 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.404749 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:38:53.404964 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.404904 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:38:53.411272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.411208 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerStarted","Data":"2fe4d07721cacfde1e9591c0523d6d82756a474a0a5bf078290103500e13a2f7"}
Apr 24 16:38:53.415471 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.414691 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" event={"ID":"5aeb36898807b4f178264985b1daa831","Type":"ContainerStarted","Data":"5f7d3602b8e90ebc0cae730108683084bab0e34eebfb65e1977ad0624e3110c0"}
Apr 24 16:38:53.416645 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.416617 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" event={"ID":"bce0b728-ae34-46c9-af1b-d6af180d5c23","Type":"ContainerStarted","Data":"f533c569d2cbd224fabdcc0b29e780302ed97a777fe4e08471a4dd18961610ae"}
Apr 24 16:38:53.418590 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.418562 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-86w64" event={"ID":"ac5de78b-4789-4f34-988c-b131198ef66b","Type":"ContainerStarted","Data":"291ad0353675da6f254cc7e0ac4f46ea28627ee06b55399393e287da3c6cc39b"}
Apr 24 16:38:53.422418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.422377 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"9423139a5543ca32c5e9f9e056bdf368cd0f3be15648d834d22f03ebd2cd4e03"}
Apr 24 16:38:53.423961 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.423936 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47g4c" event={"ID":"d5140d31-beb5-42da-bdf5-60b7bfc41f79","Type":"ContainerStarted","Data":"9bc968038fc46a10c55305b8321852914778684180d1655070f5b3459815d955"}
Apr 24 16:38:53.427423 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.427375 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v6j2z" event={"ID":"7282f227-12bf-4645-b003-1131e4895ca0","Type":"ContainerStarted","Data":"2607f06c0bcc010773f08405a5c2ac927c8e352d8bf5e73d0ac19a6383c1cc82"}
Apr 24 16:38:53.429379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.429355 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" event={"ID":"513c5343-09a1-48a6-8da2-774588de4d58","Type":"ContainerStarted","Data":"ef2ddfe2f32e09754064dcf74aebf501eb411fdfeaf2ec4ed1924cba5e85d999"}
Apr 24 16:38:53.433822 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.433772 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fg8mz" event={"ID":"3cf6c854-7267-4e50-9e1d-4fdd84304910","Type":"ContainerStarted","Data":"20a6e73e2e88877a3acd93c29a336793e80a47ee9b9c2b1ea12c92fe65c49b15"}
Apr 24 16:38:53.436276 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.435420 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7t5b9" event={"ID":"eec44da0-010a-4044-830b-8f9776a91747","Type":"ContainerStarted","Data":"5153b60b7721c55978ae57d62e8db974eba2c2db81be0b6b293617bc594ef059"}
Apr 24 16:38:53.878128 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:53.878096 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:38:53.878298 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.878247 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:53.878439 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:53.878314 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.878294162 +0000 UTC m=+6.093018517 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:54.079862 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:54.079824 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:54.080051 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:54.079998 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:54.080051 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:54.080018 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:54.080051 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:54.080030 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:54.080268 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:54.080107 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:56.080071797 +0000 UTC m=+6.294796151 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:54.406523 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:54.406489 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:54.406975 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:54.406624 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:38:54.462554 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:54.462515 2559 generic.go:358] "Generic (PLEG): container finished" podID="944f81ceadc2b89c85f169eae3ca4b09" containerID="f2fa4774cfd38c8672b76ea4875158b3cf510b1cc495bdc22cfa9299339bf13d" exitCode=0 Apr 24 16:38:54.463888 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:54.463858 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" event={"ID":"944f81ceadc2b89c85f169eae3ca4b09","Type":"ContainerDied","Data":"f2fa4774cfd38c8672b76ea4875158b3cf510b1cc495bdc22cfa9299339bf13d"} Apr 24 16:38:54.480496 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:54.480437 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-179.ec2.internal" podStartSLOduration=3.480414767 podStartE2EDuration="3.480414767s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:53.431478445 +0000 UTC m=+3.646202823" watchObservedRunningTime="2026-04-24 16:38:54.480414767 +0000 UTC m=+4.695139145" Apr 24 16:38:55.405193 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:55.404661 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:55.405193 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:55.404808 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:38:55.496263 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:55.496226 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" event={"ID":"944f81ceadc2b89c85f169eae3ca4b09","Type":"ContainerStarted","Data":"498c86cb09cb4923166512b768a1f24cb9cf80613f46bc9908fcf1a91b39a72b"} Apr 24 16:38:55.895189 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:55.895101 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:55.895363 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:55.895244 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:55.895363 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:55.895308 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:38:59.895289562 +0000 UTC m=+10.110013922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:56.096604 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:56.096566 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:56.096761 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:56.096727 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:56.096761 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:56.096755 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:56.096927 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:56.096773 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:56.096927 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:56.096837 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:00.096817297 +0000 UTC m=+10.311541671 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:56.417289 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:56.417252 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:56.417496 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:56.417436 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:38:57.404116 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:57.404053 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:57.404600 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:57.404227 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:38:58.404168 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:58.404131 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:38:58.404590 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:58.404267 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:38:59.404246 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:59.404103 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:59.404731 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:59.404254 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:38:59.931266 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:38:59.931227 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:38:59.931470 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:59.931375 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:59.931470 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:38:59.931438 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.931419942 +0000 UTC m=+18.146144302 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:00.132534 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:00.132486 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:00.132727 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:00.132673 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:00.132727 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:00.132698 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:00.132727 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:00.132711 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:00.132876 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:00.132775 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:08.132754484 +0000 UTC m=+18.347478844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:00.406074 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:00.405624 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:00.406074 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:00.405735 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:39:01.404383 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:01.404341 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:01.404579 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:01.404466 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:02.404843 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.404762 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:02.405306 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:02.404882 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:39:02.450462 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.450415 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-179.ec2.internal" podStartSLOduration=11.450396918 podStartE2EDuration="11.450396918s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:55.525420596 +0000 UTC m=+5.740144973" watchObservedRunningTime="2026-04-24 16:39:02.450396918 +0000 UTC m=+12.665121298" Apr 24 16:39:02.451012 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.450995 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dlfcn"] Apr 24 16:39:02.453735 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.453718 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.453836 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:02.453788 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4" Apr 24 16:39:02.552140 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.552092 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-kubelet-config\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.552288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.552159 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-dbus\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.552288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.552223 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653492 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.653444 2559 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-dbus\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653678 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.653542 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653678 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.653596 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-kubelet-config\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653678 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.653621 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-dbus\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653841 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:02.653681 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e79110fd-148d-4488-a9fe-340f434292f4-kubelet-config\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:02.653841 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:02.653712 2559 secret.go:189] Couldn't get 
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:02.653841 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:02.653787 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:03.153766503 +0000 UTC m=+13.368490869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:03.158147 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:03.158109 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:03.158340 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:03.158278 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:03.158401 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:03.158357 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:04.158338016 +0000 UTC m=+14.373062384 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:03.404454 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:03.404420 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:03.404605 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:03.404538 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:04.166743 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:04.166703 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:04.167216 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:04.166861 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:04.167216 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:04.166928 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:06.166910444 +0000 UTC m=+16.381634823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:04.403938 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:04.403893 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:04.404122 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:04.403894 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:04.404122 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:04.404048 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:04.404122 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:04.404115 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:05.404604 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:05.404583 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:05.404948 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:05.404703 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:06.179807 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:06.179771 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:06.180011 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:06.179932 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:06.180011 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:06.180005 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:10.179984912 +0000 UTC m=+20.394709281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:06.404134 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:06.404099 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:06.404272 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:06.404203 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:06.404272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:06.404261 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:06.404366 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:06.404346 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:07.403877 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:07.403842 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:07.404365 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:07.403981 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:07.992197 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:07.992155 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:07.992366 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:07.992289 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:07.992366 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:07.992348 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.992329097 +0000 UTC m=+34.207053453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:08.193024 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:08.192989 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:08.193275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.193175 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:08.193275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.193198 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:08.193275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.193207 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:08.193275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.193271 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.193250871 +0000 UTC m=+34.407975242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:08.404523 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:08.404444 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:08.404913 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:08.404444 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:08.404913 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.404586 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:08.404913 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:08.404652 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:09.404686 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:09.404647 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:09.405146 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:09.404783 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:10.205051 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.204871 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:10.205213 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:10.205032 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:10.205213 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:10.205167 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:18.205148967 +0000 UTC m=+28.419873326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:10.404748 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.404580 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:10.405385 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.404646 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:10.405385 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:10.404814 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:10.405385 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:10.404937 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:10.523766 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.523734 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" event={"ID":"bce0b728-ae34-46c9-af1b-d6af180d5c23","Type":"ContainerStarted","Data":"46779285f96dc00d042e2c3d2888917bf94e0d5cac52779926c88ceb0956ed9d"}
Apr 24 16:39:10.525737 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.525711 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-86w64" event={"ID":"ac5de78b-4789-4f34-988c-b131198ef66b","Type":"ContainerStarted","Data":"c4bc2a492404b2389b929bd6a7484a941432f5f32efa194ab3f3effdb2217b3e"}
Apr 24 16:39:10.528391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528374 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:39:10.528806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528710 2559 generic.go:358] "Generic (PLEG): container finished" podID="4d355f77-a36f-48e8-a168-c520805efc91" containerID="4a878bd8c6aaa7c818d3edf1e8defac26832ea8663f6fcd4da2f4135d913e155" exitCode=1
Apr 24 16:39:10.528806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528749 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"ed2205cc4c8201c631aed395154d64fbe76e33c437e823f79771eeb2002d9996"}
Apr 24 16:39:10.528806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528781 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"fb8632590fe0d8b41c8e0a2562476e9af9003c22a797746d9f18bfd5cf1e871f"}
Apr 24 16:39:10.528806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528794 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"4fed19e9d1aaa3bccde6c77ecd5d7f19ebc3b2b434cd1693db207ed9c156925c"}
Apr 24 16:39:10.528806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528809 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerDied","Data":"4a878bd8c6aaa7c818d3edf1e8defac26832ea8663f6fcd4da2f4135d913e155"}
Apr 24 16:39:10.529101 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.528824 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"b3a91bbe572abcfe5ed88e746133e205335349f7f2d1badd51dfebd5722d6aa2"}
Apr 24 16:39:10.530067 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.530048 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47g4c" event={"ID":"d5140d31-beb5-42da-bdf5-60b7bfc41f79","Type":"ContainerStarted","Data":"4b11ded12fdf0c5d92dd1910eab761e315db74dc3d0a02722b54e2c24e80dcdf"}
Apr 24 16:39:10.531451 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.531428 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v6j2z" event={"ID":"7282f227-12bf-4645-b003-1131e4895ca0","Type":"ContainerStarted","Data":"be198df96938ad1090b0c70efda63794ecf02172e8dbe5e87060123d6d518cf8"}
Apr 24 16:39:10.532743 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.532716 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" event={"ID":"513c5343-09a1-48a6-8da2-774588de4d58","Type":"ContainerStarted","Data":"b415e0c8f96f5dced819f0b4308fe5ba65455e29a41917b3d480a94c16a6a08d"}
Apr 24 16:39:10.534496 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.534462 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7t5b9" event={"ID":"eec44da0-010a-4044-830b-8f9776a91747","Type":"ContainerStarted","Data":"580f5d308bffbf42b079fdafc379a946979a8533abbbe02bf481a3bf3ddc3f5c"}
Apr 24 16:39:10.535915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.535892 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="0b2d28f2ab1021a3190b05fe6a935ca9202d36605918bc12ca2ee5fcfa991581" exitCode=0
Apr 24 16:39:10.535986 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.535927 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"0b2d28f2ab1021a3190b05fe6a935ca9202d36605918bc12ca2ee5fcfa991581"}
Apr 24 16:39:10.542521 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.542483 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-86w64" podStartSLOduration=8.139281243 podStartE2EDuration="20.542468305s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.99154858 +0000 UTC m=+3.206272934" lastFinishedPulling="2026-04-24 16:39:05.394735628 +0000 UTC m=+15.609459996" observedRunningTime="2026-04-24 16:39:10.541771442 +0000 UTC m=+20.756495821" watchObservedRunningTime="2026-04-24 16:39:10.542468305 +0000 UTC m=+20.757192682"
Apr 24 16:39:10.613766 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.613701 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lkl8j" podStartSLOduration=3.825033039 podStartE2EDuration="20.613683216s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.99846862 +0000 UTC m=+3.213192976" lastFinishedPulling="2026-04-24 16:39:09.787118785 +0000 UTC m=+20.001843153" observedRunningTime="2026-04-24 16:39:10.588012445 +0000 UTC m=+20.802736823" watchObservedRunningTime="2026-04-24 16:39:10.613683216 +0000 UTC m=+20.828407591"
Apr 24 16:39:10.614268 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.614228 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-47g4c" podStartSLOduration=3.819769156 podStartE2EDuration="20.614220168s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:53.000753208 +0000 UTC m=+3.215477586" lastFinishedPulling="2026-04-24 16:39:09.795204244 +0000 UTC m=+20.009928598" observedRunningTime="2026-04-24 16:39:10.61375427 +0000 UTC m=+20.828478649" watchObservedRunningTime="2026-04-24 16:39:10.614220168 +0000 UTC m=+20.828944544"
Apr 24 16:39:10.630339 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.630287 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7t5b9" podStartSLOduration=3.91724493 podStartE2EDuration="20.630273576s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.997338661 +0000 UTC m=+3.212063030" lastFinishedPulling="2026-04-24 16:39:09.710367312 +0000 UTC m=+19.925091676" observedRunningTime="2026-04-24 16:39:10.629292554 +0000 UTC m=+20.844016929" watchObservedRunningTime="2026-04-24 16:39:10.630273576 +0000 UTC m=+20.844997952"
Apr 24 16:39:10.646837 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.646783 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v6j2z" podStartSLOduration=3.936298592 podStartE2EDuration="20.646768174s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.999769578 +0000 UTC m=+3.214493934" lastFinishedPulling="2026-04-24 16:39:09.710239163 +0000 UTC m=+19.924963516" observedRunningTime="2026-04-24 16:39:10.645888985 +0000 UTC m=+20.860613360" watchObservedRunningTime="2026-04-24 16:39:10.646768174 +0000 UTC m=+20.861492549"
Apr 24 16:39:10.921155 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.921122 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:39:10.921739 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:10.921714 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:39:11.403876 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.403848 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:11.404033 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:11.403981 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:11.422978 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.422952 2559 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 16:39:11.541272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.541241 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:39:11.541634 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.541611 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"29862af848e27017606aef41358875b7f279f452778046ba8f4d929cb531bfd9"}
Apr 24 16:39:11.542938 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.542900 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fg8mz" event={"ID":"3cf6c854-7267-4e50-9e1d-4fdd84304910","Type":"ContainerStarted","Data":"2ad43b80e4ab965905b8bb5a595432cfbe6231e658e0b17b6de9103fef121d76"}
Apr 24 16:39:11.544875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.544844 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" event={"ID":"bce0b728-ae34-46c9-af1b-d6af180d5c23","Type":"ContainerStarted","Data":"208f35e6e1f0fe43c08694291f7325ea0d5880fdb06d5af1c0f8b8da839c5886"}
Apr 24 16:39:11.559152 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:11.559111 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fg8mz" podStartSLOduration=4.799345246 podStartE2EDuration="21.559097529s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.998696581 +0000 UTC m=+3.213420940" lastFinishedPulling="2026-04-24 16:39:09.75844886 +0000 UTC m=+19.973173223" observedRunningTime="2026-04-24 16:39:11.558605248 +0000 UTC m=+21.773329623" watchObservedRunningTime="2026-04-24 16:39:11.559097529 +0000 UTC m=+21.773821899"
Apr 24 16:39:12.111805 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.111767 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:39:12.119126 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.119097 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-86w64"
Apr 24 16:39:12.342542 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.342429 2559 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:11.422973797Z","UUID":"04a3bf1a-b813-4e17-a40e-358457b07a9b","Handler":null,"Name":"","Endpoint":""}
Apr 24 16:39:12.345603 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.345579 2559 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 16:39:12.345603 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.345605 2559 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 16:39:12.403811 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.403698 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:12.403811 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.403696 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:12.403990 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:12.403825 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:12.403990 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:12.403892 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:12.549136 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:12.549102 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" event={"ID":"bce0b728-ae34-46c9-af1b-d6af180d5c23","Type":"ContainerStarted","Data":"106966f69da534c6d35518e9176b3003d3301032a297ff6ab592480fb23056cd"}
Apr 24 16:39:13.404760 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:13.404587 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:13.404906 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:13.404840 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:13.553745 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:13.553718 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:39:13.554169 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:13.554109 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"3cfff11eb5e3a028be2235591181f721b2602eb4baf3e16cc7225cd231ddf70a"}
Apr 24 16:39:14.404606 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:14.404569 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:14.404803 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:14.404698 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:14.404803 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:14.404748 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:14.404923 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:14.404869 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:15.404965 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.404678 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:15.405788 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:15.404989 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d"
Apr 24 16:39:15.562841 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.562814 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:39:15.563199 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.563166 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"877a7fa15ffa1b456a812fc71824de377adb8d6179dd8f3b2780ef28027ded9b"}
Apr 24 16:39:15.563522 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.563497 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:15.563690 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.563673 2559 scope.go:117] "RemoveContainer" containerID="4a878bd8c6aaa7c818d3edf1e8defac26832ea8663f6fcd4da2f4135d913e155"
Apr 24 16:39:15.564810 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.564789 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="446ae0ad4f528380aabb442bedabfa8fa2682e26877041a8c0508ba090ff8688" exitCode=0
Apr 24 16:39:15.564903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.564828 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"446ae0ad4f528380aabb442bedabfa8fa2682e26877041a8c0508ba090ff8688"}
Apr 24 16:39:15.579728 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.579707 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:15.592175 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:15.592131 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r98c9" podStartSLOduration=6.179832146 podStartE2EDuration="25.592117227s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.994548468 +0000 UTC m=+3.209272823" lastFinishedPulling="2026-04-24 16:39:12.406833551 +0000 UTC m=+22.621557904" observedRunningTime="2026-04-24 16:39:13.578436919 +0000 UTC m=+23.793161296" watchObservedRunningTime="2026-04-24 16:39:15.592117227 +0000 UTC m=+25.806841602"
Apr 24 16:39:16.404202 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.404167 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:16.404353 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.404209 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:16.404353 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:16.404295 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:16.404446 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:16.404426 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:16.568649 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.568569 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="50749a10f4bff06e0b273ae877f3bdad1a93d11e62c2e58248c29c331af5987a" exitCode=0
Apr 24 16:39:16.569037 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.568648 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"50749a10f4bff06e0b273ae877f3bdad1a93d11e62c2e58248c29c331af5987a"}
Apr 24 16:39:16.572014 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.571995 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:39:16.572447 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.572426 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" event={"ID":"4d355f77-a36f-48e8-a168-c520805efc91","Type":"ContainerStarted","Data":"6946bad4c68fdc0e27af9efd80743e936178b84ef0137c74250d88f5476834b6"}
Apr 24 16:39:16.572812 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.572793 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:16.572898 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.572818 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:16.586933 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.586910 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:16.620881 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.620830 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px" podStartSLOduration=9.744348719 podStartE2EDuration="26.620816923s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:53.00204032 +0000 UTC m=+3.216764675" lastFinishedPulling="2026-04-24 16:39:09.878508521 +0000 UTC m=+20.093232879" observedRunningTime="2026-04-24 16:39:16.619852983 +0000 UTC m=+26.834577360" watchObservedRunningTime="2026-04-24 16:39:16.620816923 +0000 UTC m=+26.835541299"
Apr 24 16:39:16.950205 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.950176 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kddtl"]
Apr 24 16:39:16.950379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.950269 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:16.950379 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:16.950343 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb"
Apr 24 16:39:16.953343 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.953319 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dlfcn"]
Apr 24 16:39:16.953474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.953408 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:16.953533 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:16.953507 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4"
Apr 24 16:39:16.954003 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.953980 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xlwrd"]
Apr 24 16:39:16.954136 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:16.954120 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:16.954253 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:16.954217 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:17.576731 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:17.576648 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="fcb41888b496192fd2af6f6779962c6c3ad5bcd47750e4a8fa602c761f088c89" exitCode=0 Apr 24 16:39:17.577114 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:17.576730 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"fcb41888b496192fd2af6f6779962c6c3ad5bcd47750e4a8fa602c761f088c89"} Apr 24 16:39:18.268303 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:18.268269 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:18.268489 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:18.268446 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:18.268536 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:18.268527 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret podName:e79110fd-148d-4488-a9fe-340f434292f4 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:34.268504771 +0000 UTC m=+44.483229139 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret") pod "global-pull-secret-syncer-dlfcn" (UID: "e79110fd-148d-4488-a9fe-340f434292f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:18.404544 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:18.404508 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:18.404735 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:18.404508 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:18.404735 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:18.404634 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4" Apr 24 16:39:18.404735 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:18.404519 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:18.404962 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:18.404734 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:39:18.404962 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:18.404822 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:20.405383 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:20.405179 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:20.405903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:20.405254 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:20.405903 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:20.405455 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4" Apr 24 16:39:20.405903 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:20.405558 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:20.405903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:20.405274 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:20.405903 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:20.405643 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:39:22.404287 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.404250 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:22.405013 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.404398 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn" Apr 24 16:39:22.405013 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.404515 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlfcn" podUID="e79110fd-148d-4488-a9fe-340f434292f4" Apr 24 16:39:22.405013 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.404400 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:39:22.405013 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.404562 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:22.405013 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.404627 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kddtl" podUID="f55c507c-3b62-4dd9-9a36-9fd09f9412cb" Apr 24 16:39:22.602067 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.602035 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-179.ec2.internal" event="NodeReady" Apr 24 16:39:22.602246 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.602196 2559 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:22.639479 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.639338 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"] Apr 24 16:39:22.644806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.644771 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.650115 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.650066 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:39:22.650272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.650104 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:39:22.651952 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.651812 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dphxj"] Apr 24 16:39:22.653276 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.653176 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6j55m\"" Apr 24 16:39:22.653734 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.653713 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:39:22.655574 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.655477 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:39:22.658903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.658749 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:39:22.658903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.658778 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\"" Apr 24 16:39:22.659229 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.659210 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:39:22.659449 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.659426 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:39:22.659814 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.659794 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"] Apr 24 16:39:22.660342 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.660323 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:39:22.664685 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.664660 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dphxj"] Apr 24 16:39:22.669965 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.669944 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-884bg"] Apr 24 16:39:22.673569 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.673545 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.676128 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.676105 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:39:22.676272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.676158 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:39:22.676336 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.676281 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\"" Apr 24 16:39:22.686588 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.686562 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-884bg"] Apr 24 16:39:22.804828 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.804793 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805023 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.804861 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5qf\" (UniqueName: \"kubernetes.io/projected/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-kube-api-access-8v5qf\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.805023 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.804919 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.805023 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.804955 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-tmp-dir\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.805023 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.804979 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:39:22.805023 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805009 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805036 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjkn\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805070 
2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-config-volume\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805103 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805148 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805168 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dfvt\" (UniqueName: \"kubernetes.io/projected/c579f8b7-4799-4d6b-8770-b69441d1c6b7-kube-api-access-7dfvt\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805194 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca\") pod \"image-registry-78f7696f44-4cxsn\" 
(UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805219 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.805520 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.805270 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.905784 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905697 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-config-volume\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:22.905784 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905738 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.905784 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905772 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905800 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dfvt\" (UniqueName: \"kubernetes.io/projected/c579f8b7-4799-4d6b-8770-b69441d1c6b7-kube-api-access-7dfvt\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905832 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905855 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.905880 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.905903 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found Apr 24 16:39:22.906062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.905963 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.405943156 +0000 UTC m=+33.620667510 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found Apr 24 16:39:22.906394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.905889 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.906394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906321 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:22.906394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906372 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5qf\" (UniqueName: \"kubernetes.io/projected/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-kube-api-access-8v5qf\") pod \"dns-default-884bg\" (UID: 
\"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906403 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-config-volume\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906416 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906444 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-tmp-dir\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906468 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906494 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.906547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906521 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjkn\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.907000 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.906970 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.907136 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.907075 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.907204 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.907190 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:22.907264 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.907216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-tmp-dir\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.907264 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.907240 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.407222924 +0000 UTC m=+33.621947292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:22.907369 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.907347 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:22.907418 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:22.907399 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.407380425 +0000 UTC m=+33.622104787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:22.908763 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.908731 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.911096 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.911057 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.911215 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.911063 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.919927 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.919858 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjkn\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:22.920709 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.920660 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5qf\" (UniqueName: \"kubernetes.io/projected/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-kube-api-access-8v5qf\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:22.921030 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.921003 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dfvt\" (UniqueName: \"kubernetes.io/projected/c579f8b7-4799-4d6b-8770-b69441d1c6b7-kube-api-access-7dfvt\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:22.921206 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:22.921185 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:23.410156 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:23.410113 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:23.410156 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:23.410156 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:23.410206 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410288 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410316 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410329 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410370 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410380 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.410363237 +0000 UTC m=+34.625087592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410468 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.410440062 +0000 UTC m=+34.625164421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:23.410879 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:23.410484 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.410476545 +0000 UTC m=+34.625200901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:24.015861 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.015822 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:24.016008 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.015975 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:24.016064 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.016045 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.016027766 +0000 UTC m=+66.230752141 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:24.218217 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.218130 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:24.218364 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.218299 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:24.218364 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.218319 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:24.218364 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.218329 2559 projected.go:194] Error preparing data for projected volume kube-api-access-dq6rz for pod openshift-network-diagnostics/network-check-target-kddtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:24.218459 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.218378 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz podName:f55c507c-3b62-4dd9-9a36-9fd09f9412cb nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.218364379 +0000 UTC m=+66.433088739 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dq6rz" (UniqueName: "kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz") pod "network-check-target-kddtl" (UID: "f55c507c-3b62-4dd9-9a36-9fd09f9412cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:24.404477 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.404443 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:39:24.404477 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.404478 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl"
Apr 24 16:39:24.404687 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.404494 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:24.407167 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407139 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:39:24.407321 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407144 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rvcx9\""
Apr 24 16:39:24.407321 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407146 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\""
Apr 24 16:39:24.407321 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407149 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.407321 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407292 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.407321 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.407190 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 16:39:24.419799 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.419778 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.419804 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.419830 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419905 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419917 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419927 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419926 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419972 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:26.419955687 +0000 UTC m=+36.634680040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419985 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:26.419979095 +0000 UTC m=+36.634703451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:24.420188 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:24.419994 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:26.419989835 +0000 UTC m=+36.634714192 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found
Apr 24 16:39:24.591743 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.591657 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="87fe5c340e4b2289e3d37925534a758b4766d85a0e706af4ca546fda8e8833ed" exitCode=0
Apr 24 16:39:24.591743 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:24.591708 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"87fe5c340e4b2289e3d37925534a758b4766d85a0e706af4ca546fda8e8833ed"}
Apr 24 16:39:25.595318 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:25.595285 2559 generic.go:358] "Generic (PLEG): container finished" podID="af7a310a-32d2-48cf-b2d6-69c07daae4b0" containerID="f4e8e38d179042933c83a6540318cf3469efc328c592627e07980d9d483e21e1" exitCode=0
Apr 24 16:39:25.595674 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:25.595330 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerDied","Data":"f4e8e38d179042933c83a6540318cf3469efc328c592627e07980d9d483e21e1"}
Apr 24 16:39:26.438423 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:26.438230 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:26.438434 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:26.438475 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438380 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438545 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:30.438528771 +0000 UTC m=+40.653253130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438577 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438585 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:26.438595 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438597 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found
Apr 24 16:39:26.438821 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438632 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:30.438616374 +0000 UTC m=+40.653340747 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:26.438821 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:26.438647 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:30.438641054 +0000 UTC m=+40.653365412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found
Apr 24 16:39:26.600342 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:26.600306 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" event={"ID":"af7a310a-32d2-48cf-b2d6-69c07daae4b0","Type":"ContainerStarted","Data":"224b66cb2bca8efb9e138bb5299fa34f5b25d424c73c0d901c001b9305e56254"}
Apr 24 16:39:26.627341 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:26.627298 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8gmjx" podStartSLOduration=5.976426781 podStartE2EDuration="36.627285053s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.996148619 +0000 UTC m=+3.210872987" lastFinishedPulling="2026-04-24 16:39:23.6470069 +0000 UTC m=+33.861731259" observedRunningTime="2026-04-24 16:39:26.62303837 +0000 UTC m=+36.837762746" watchObservedRunningTime="2026-04-24 16:39:26.627285053 +0000 UTC m=+36.842009429"
Apr 24 16:39:30.471542 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:30.471504 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:30.471542 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:30.471546 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:30.471590 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471655 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471731 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471758 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.471742233 +0000 UTC m=+48.686466588 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471784 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.471771474 +0000 UTC m=+48.686495829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471737 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471800 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found
Apr 24 16:39:30.471992 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:30.471836 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.471825252 +0000 UTC m=+48.686549619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found
Apr 24 16:39:34.300531 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:34.300489 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:34.303601 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:34.303575 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e79110fd-148d-4488-a9fe-340f434292f4-original-pull-secret\") pod \"global-pull-secret-syncer-dlfcn\" (UID: \"e79110fd-148d-4488-a9fe-340f434292f4\") " pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:34.324870 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:34.324847 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlfcn"
Apr 24 16:39:34.513368 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:34.513337 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dlfcn"]
Apr 24 16:39:34.518931 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:39:34.518905 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode79110fd_148d_4488_a9fe_340f434292f4.slice/crio-5676687a83f817a79646bdbdb622fef1df1f0c52c245b18161af13c7cbf3a95a WatchSource:0}: Error finding container 5676687a83f817a79646bdbdb622fef1df1f0c52c245b18161af13c7cbf3a95a: Status 404 returned error can't find the container with id 5676687a83f817a79646bdbdb622fef1df1f0c52c245b18161af13c7cbf3a95a
Apr 24 16:39:34.615628 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:34.615543 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dlfcn" event={"ID":"e79110fd-148d-4488-a9fe-340f434292f4","Type":"ContainerStarted","Data":"5676687a83f817a79646bdbdb622fef1df1f0c52c245b18161af13c7cbf3a95a"}
Apr 24 16:39:38.533737 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:38.533646 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg"
Apr 24 16:39:38.533737 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:38.533688 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj"
Apr 24 16:39:38.533737 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:38.533728 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533780 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533849 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:54.533833193 +0000 UTC m=+64.748557550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533854 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533867 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533872 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533907 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:54.533897061 +0000 UTC m=+64.748621428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found
Apr 24 16:39:38.534275 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:38.533944 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:54.533912514 +0000 UTC m=+64.748636889 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found
Apr 24 16:39:38.624469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:38.624429 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dlfcn" event={"ID":"e79110fd-148d-4488-a9fe-340f434292f4","Type":"ContainerStarted","Data":"e11e8d77df03416303af95e8bb3b987723682d5de01c5053a93a0b8f1f6c8c8f"}
Apr 24 16:39:38.647099 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:38.647032 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dlfcn" podStartSLOduration=32.906225373 podStartE2EDuration="36.647018009s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:34.521428182 +0000 UTC m=+44.736152535" lastFinishedPulling="2026-04-24 16:39:38.262220803 +0000 UTC m=+48.476945171" observedRunningTime="2026-04-24 16:39:38.64698245 +0000 UTC m=+48.861706828" watchObservedRunningTime="2026-04-24 16:39:38.647018009 +0000 UTC m=+48.861742376"
Apr 24 16:39:48.589724 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:48.589692 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f9px"
Apr 24 16:39:54.120886 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.120851 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4"]
Apr 24 16:39:54.125218 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.125199 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4"
Apr 24 16:39:54.127516 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.127497 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 16:39:54.127621 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.127563 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 16:39:54.128501 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.128483 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 16:39:54.128601 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.128506 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 16:39:54.132910 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.132891 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4"] Apr 24 16:39:54.138502 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.138483 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzzx\" (UniqueName: \"kubernetes.io/projected/e4430698-7f43-418f-ab61-89e6da988f97-kube-api-access-4rzzx\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.138580 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.138546 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e4430698-7f43-418f-ab61-89e6da988f97-klusterlet-config\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.138620 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.138586 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4430698-7f43-418f-ab61-89e6da988f97-tmp\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.239399 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.239364 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e4430698-7f43-418f-ab61-89e6da988f97-klusterlet-config\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 
16:39:54.239399 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.239402 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4430698-7f43-418f-ab61-89e6da988f97-tmp\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.239598 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.239451 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rzzx\" (UniqueName: \"kubernetes.io/projected/e4430698-7f43-418f-ab61-89e6da988f97-kube-api-access-4rzzx\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.239827 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.239808 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4430698-7f43-418f-ab61-89e6da988f97-tmp\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.241796 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.241777 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e4430698-7f43-418f-ab61-89e6da988f97-klusterlet-config\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.247287 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.247265 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzzx\" (UniqueName: 
\"kubernetes.io/projected/e4430698-7f43-418f-ab61-89e6da988f97-kube-api-access-4rzzx\") pod \"klusterlet-addon-workmgr-8554f567b6-g7nv4\" (UID: \"e4430698-7f43-418f-ab61-89e6da988f97\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.434730 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.434691 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:54.542722 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.542691 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:39:54.542722 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.542725 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.542755 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542849 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542908 2559 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542919 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542931 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:26.542910315 +0000 UTC m=+96.757634681 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542954 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:26.542943873 +0000 UTC m=+96.757668227 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found Apr 24 16:39:54.542955 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542847 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:54.543299 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:54.542990 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:26.542982588 +0000 UTC m=+96.757706967 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found Apr 24 16:39:54.546893 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.546870 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4"] Apr 24 16:39:54.550753 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:39:54.550734 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4430698_7f43_418f_ab61_89e6da988f97.slice/crio-1f189421df3daa67b3aea230e2af5ac44f2cf31d38083e08eae203bc87783d42 WatchSource:0}: Error finding container 1f189421df3daa67b3aea230e2af5ac44f2cf31d38083e08eae203bc87783d42: Status 404 returned error can't find the container with id 1f189421df3daa67b3aea230e2af5ac44f2cf31d38083e08eae203bc87783d42 Apr 24 16:39:54.657687 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:54.657655 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" event={"ID":"e4430698-7f43-418f-ab61-89e6da988f97","Type":"ContainerStarted","Data":"1f189421df3daa67b3aea230e2af5ac44f2cf31d38083e08eae203bc87783d42"} Apr 24 16:39:56.054883 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.054835 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:39:56.057879 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.057824 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:56.065297 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:56.065272 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:39:56.065453 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:39:56.065348 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:41:00.065327263 +0000 UTC m=+130.280051623 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : secret "metrics-daemon-secret" not found Apr 24 16:39:56.257251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.257210 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:56.260021 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.259996 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:39:56.270016 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.269987 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:39:56.280734 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.280705 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6rz\" (UniqueName: \"kubernetes.io/projected/f55c507c-3b62-4dd9-9a36-9fd09f9412cb-kube-api-access-dq6rz\") pod \"network-check-target-kddtl\" (UID: \"f55c507c-3b62-4dd9-9a36-9fd09f9412cb\") " pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:56.522415 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.522370 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rvcx9\"" Apr 24 16:39:56.530919 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:56.530896 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:39:58.217851 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.217803 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kddtl"] Apr 24 16:39:58.221680 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:39:58.221650 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55c507c_3b62_4dd9_9a36_9fd09f9412cb.slice/crio-76209393e7f9cb89a219f482016139fe2355f7e64553d9b2da1c13467d6e4bac WatchSource:0}: Error finding container 76209393e7f9cb89a219f482016139fe2355f7e64553d9b2da1c13467d6e4bac: Status 404 returned error can't find the container with id 76209393e7f9cb89a219f482016139fe2355f7e64553d9b2da1c13467d6e4bac Apr 24 16:39:58.668487 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.668371 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" event={"ID":"e4430698-7f43-418f-ab61-89e6da988f97","Type":"ContainerStarted","Data":"975b689a2c18d2dd67a94db0f26a4d5fa9e318896dbdeb52210f1bcbbf752039"} Apr 24 16:39:58.669051 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.669026 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:58.670648 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.670532 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" Apr 24 16:39:58.671560 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.671522 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kddtl" 
event={"ID":"f55c507c-3b62-4dd9-9a36-9fd09f9412cb","Type":"ContainerStarted","Data":"76209393e7f9cb89a219f482016139fe2355f7e64553d9b2da1c13467d6e4bac"} Apr 24 16:39:58.689311 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:39:58.689252 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8554f567b6-g7nv4" podStartSLOduration=0.723624819 podStartE2EDuration="4.689233397s" podCreationTimestamp="2026-04-24 16:39:54 +0000 UTC" firstStartedPulling="2026-04-24 16:39:54.553000654 +0000 UTC m=+64.767725011" lastFinishedPulling="2026-04-24 16:39:58.518609221 +0000 UTC m=+68.733333589" observedRunningTime="2026-04-24 16:39:58.688594964 +0000 UTC m=+68.903319340" watchObservedRunningTime="2026-04-24 16:39:58.689233397 +0000 UTC m=+68.903957776" Apr 24 16:40:01.678360 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:01.678325 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kddtl" event={"ID":"f55c507c-3b62-4dd9-9a36-9fd09f9412cb","Type":"ContainerStarted","Data":"56b0f7289dfaee14017a89210b19b9d9ff5774cce9dd1a64a17aadfd8f1a235f"} Apr 24 16:40:01.678774 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:01.678472 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:40:01.693336 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:01.693289 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kddtl" podStartSLOduration=69.052887094 podStartE2EDuration="1m11.693275586s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:39:58.223602992 +0000 UTC m=+68.438327349" lastFinishedPulling="2026-04-24 16:40:00.863991472 +0000 UTC m=+71.078715841" observedRunningTime="2026-04-24 16:40:01.69298578 +0000 UTC m=+71.907710162" 
watchObservedRunningTime="2026-04-24 16:40:01.693275586 +0000 UTC m=+71.907999961" Apr 24 16:40:26.571266 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:26.571210 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:26.571292 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:26.571311 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571362 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571385 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78f7696f44-4cxsn: secret "image-registry-tls" not found Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571396 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:26.571673 ip-10-0-137-179 
kubenswrapper[2559]: E0424 16:40:26.571416 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571449 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls podName:10f41d61-9007-409b-a71a-b7938df11802 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:30.571432562 +0000 UTC m=+160.786156916 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls") pod "image-registry-78f7696f44-4cxsn" (UID: "10f41d61-9007-409b-a71a-b7938df11802") : secret "image-registry-tls" not found Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571462 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert podName:c579f8b7-4799-4d6b-8770-b69441d1c6b7 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:30.57145645 +0000 UTC m=+160.786180807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert") pod "ingress-canary-dphxj" (UID: "c579f8b7-4799-4d6b-8770-b69441d1c6b7") : secret "canary-serving-cert" not found Apr 24 16:40:26.571673 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:40:26.571472 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls podName:641dddb3-e48f-4ac9-9fad-88baa8d3cd29 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:30.571467205 +0000 UTC m=+160.786191563 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls") pod "dns-default-884bg" (UID: "641dddb3-e48f-4ac9-9fad-88baa8d3cd29") : secret "dns-default-metrics-tls" not found Apr 24 16:40:32.683243 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:40:32.683211 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kddtl" Apr 24 16:41:00.108714 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:00.108669 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd" Apr 24 16:41:00.109230 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:00.108831 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:41:00.109230 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:00.108905 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs podName:9a75b238-62f3-4139-9303-b235113baa9d nodeName:}" failed. No retries permitted until 2026-04-24 16:43:02.108888112 +0000 UTC m=+252.323612466 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs") pod "network-metrics-daemon-xlwrd" (UID: "9a75b238-62f3-4139-9303-b235113baa9d") : secret "metrics-daemon-secret" not found Apr 24 16:41:01.086902 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.086868 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7d5f68b498-j2k2b"] Apr 24 16:41:01.088847 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.088832 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:01.092468 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.092446 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 16:41:01.093054 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.093035 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 16:41:01.093186 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.093094 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 16:41:01.093186 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.093103 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 16:41:01.093297 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.093270 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 16:41:01.093346 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.093333 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 16:41:01.093770 ip-10-0-137-179 kubenswrapper[2559]: 
I0424 16:41:01.093754 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nfnjt\"" Apr 24 16:41:01.103718 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.103695 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7d5f68b498-j2k2b"] Apr 24 16:41:01.215416 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.215377 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:01.215868 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.215439 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll8h5\" (UniqueName: \"kubernetes.io/projected/cdada5da-f65e-47aa-8f22-abd619fe9d1c-kube-api-access-ll8h5\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:01.215868 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.215530 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-stats-auth\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:01.215868 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.215636 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod 
\"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.215868 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.215673 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-default-certificate\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317007 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.316964 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-stats-auth\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317160 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.317067 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317160 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.317141 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-default-certificate\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.317167 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317272 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.317213 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll8h5\" (UniqueName: \"kubernetes.io/projected/cdada5da-f65e-47aa-8f22-abd619fe9d1c-kube-api-access-ll8h5\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.317272 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.317234 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:01.317413 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.317324 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:01.817302426 +0000 UTC m=+132.032026802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:01.317413 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.317345 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:01.817334789 +0000 UTC m=+132.032059152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:01.319518 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.319491 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-default-certificate\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.319627 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.319568 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-stats-auth\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.327207 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.327185 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll8h5\" (UniqueName: \"kubernetes.io/projected/cdada5da-f65e-47aa-8f22-abd619fe9d1c-kube-api-access-ll8h5\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.821622 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.821566 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.821622 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:01.821627 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:01.821856 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.821710 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:01.821856 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.821744 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:02.821730015 +0000 UTC m=+133.036454369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:01.821856 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:01.821764 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:02.82175061 +0000 UTC m=+133.036474965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:02.829098 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:02.829053 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:02.829488 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:02.829120 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:02.829488 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:02.829198 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:02.829488 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:02.829248 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:04.829234096 +0000 UTC m=+135.043958450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:02.829488 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:02.829262 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:04.829256417 +0000 UTC m=+135.043980774 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:04.844117 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:04.844050 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:04.844511 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:04.844205 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:04.844511 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:04.844219 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:04.844511 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:04.844264 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:08.844249081 +0000 UTC m=+139.058973435 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:04.844511 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:04.844324 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:08.844311151 +0000 UTC m=+139.059035505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:06.338462 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:06.338432 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7t5b9_eec44da0-010a-4044-830b-8f9776a91747/dns-node-resolver/0.log"
Apr 24 16:41:07.538219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:07.538185 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v6j2z_7282f227-12bf-4645-b003-1131e4895ca0/node-ca/0.log"
Apr 24 16:41:08.875255 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:08.875213 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:08.875703 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:08.875275 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:08.875703 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:08.875366 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:08.875703 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:08.875429 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:16.87541534 +0000 UTC m=+147.090139694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:08.875703 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:08.875444 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:16.875437672 +0000 UTC m=+147.090162030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:11.196073 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.196033 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2mjt5"]
Apr 24 16:41:11.199064 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.199040 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.201570 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.201548 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 16:41:11.201707 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.201659 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 16:41:11.202800 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.202779 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 16:41:11.202909 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.202782 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:11.202909 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.202900 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-5h4fk\""
Apr 24 16:41:11.206688 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.206668 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 16:41:11.211416 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.211394 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2mjt5"]
Apr 24 16:41:11.295558 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.295521 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-config\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.295730 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.295639 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-trusted-ca\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.295730 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.295664 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-serving-cert\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.295730 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.295711 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5v6g\" (UniqueName: \"kubernetes.io/projected/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-kube-api-access-f5v6g\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.396315 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.396264 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-trusted-ca\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.396315 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.396319 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-serving-cert\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.396488 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.396369 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5v6g\" (UniqueName: \"kubernetes.io/projected/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-kube-api-access-f5v6g\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.396488 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.396401 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-config\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.397039 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.397020 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-config\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.397097 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.397024 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-trusted-ca\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.398697 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.398673 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-serving-cert\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.404252 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.404231 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5v6g\" (UniqueName: \"kubernetes.io/projected/6bb1683c-ac7c-4b67-934a-5a1aa7657a5f-kube-api-access-f5v6g\") pod \"console-operator-9d4b6777b-2mjt5\" (UID: \"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.507900 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.507816 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:11.627774 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:11.627740 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb1683c_ac7c_4b67_934a_5a1aa7657a5f.slice/crio-8dd3846e6585aeb8b25d1643fd13ec3c74af2c50ecee618073913e938ac939c0 WatchSource:0}: Error finding container 8dd3846e6585aeb8b25d1643fd13ec3c74af2c50ecee618073913e938ac939c0: Status 404 returned error can't find the container with id 8dd3846e6585aeb8b25d1643fd13ec3c74af2c50ecee618073913e938ac939c0
Apr 24 16:41:11.628987 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.628964 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2mjt5"]
Apr 24 16:41:11.813210 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:11.813121 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" event={"ID":"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f","Type":"ContainerStarted","Data":"8dd3846e6585aeb8b25d1643fd13ec3c74af2c50ecee618073913e938ac939c0"}
Apr 24 16:41:13.817642 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:13.817553 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/0.log"
Apr 24 16:41:13.817642 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:13.817598 2559 generic.go:358] "Generic (PLEG): container finished" podID="6bb1683c-ac7c-4b67-934a-5a1aa7657a5f" containerID="029c7e57c7462ed98a75f7d5aedd79a4d5e2ea9387a611892a27c8998526ba9b" exitCode=255
Apr 24 16:41:13.817642 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:13.817630 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" event={"ID":"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f","Type":"ContainerDied","Data":"029c7e57c7462ed98a75f7d5aedd79a4d5e2ea9387a611892a27c8998526ba9b"}
Apr 24 16:41:13.818209 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:13.817945 2559 scope.go:117] "RemoveContainer" containerID="029c7e57c7462ed98a75f7d5aedd79a4d5e2ea9387a611892a27c8998526ba9b"
Apr 24 16:41:14.821178 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821152 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 16:41:14.821563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821511 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/0.log"
Apr 24 16:41:14.821563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821544 2559 generic.go:358] "Generic (PLEG): container finished" podID="6bb1683c-ac7c-4b67-934a-5a1aa7657a5f" containerID="652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06" exitCode=255
Apr 24 16:41:14.821641 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821610 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" event={"ID":"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f","Type":"ContainerDied","Data":"652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06"}
Apr 24 16:41:14.821679 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821642 2559 scope.go:117] "RemoveContainer" containerID="029c7e57c7462ed98a75f7d5aedd79a4d5e2ea9387a611892a27c8998526ba9b"
Apr 24 16:41:14.821888 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:14.821857 2559 scope.go:117] "RemoveContainer" containerID="652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06"
Apr 24 16:41:14.822117 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:14.822076 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2mjt5_openshift-console-operator(6bb1683c-ac7c-4b67-934a-5a1aa7657a5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" podUID="6bb1683c-ac7c-4b67-934a-5a1aa7657a5f"
Apr 24 16:41:15.824970 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:15.824940 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 16:41:15.825418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:15.825283 2559 scope.go:117] "RemoveContainer" containerID="652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06"
Apr 24 16:41:15.825467 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:15.825454 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2mjt5_openshift-console-operator(6bb1683c-ac7c-4b67-934a-5a1aa7657a5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" podUID="6bb1683c-ac7c-4b67-934a-5a1aa7657a5f"
Apr 24 16:41:16.932466 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:16.932423 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:16.932896 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:16.932477 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:16.932896 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:16.932571 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:41:16.932896 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:16.932610 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:32.932590042 +0000 UTC m=+163.147314400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:16.932896 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:16.932636 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs podName:cdada5da-f65e-47aa-8f22-abd619fe9d1c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:32.932623587 +0000 UTC m=+163.147347944 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs") pod "router-default-7d5f68b498-j2k2b" (UID: "cdada5da-f65e-47aa-8f22-abd619fe9d1c") : secret "router-metrics-certs-default" not found
Apr 24 16:41:17.722460 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.722428 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qmc"]
Apr 24 16:41:17.726570 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.726549 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.728904 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.728887 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 16:41:17.729033 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.728993 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 16:41:17.729033 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.729010 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 16:41:17.729167 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.729096 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 16:41:17.729880 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.729866 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vx6n\""
Apr 24 16:41:17.735738 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.735715 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qmc"]
Apr 24 16:41:17.738256 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.738236 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.738353 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.738282 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e1e680b-97f4-47ec-a54e-c610228971eb-crio-socket\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.738353 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.738309 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.738460 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.738358 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5wz\" (UniqueName: \"kubernetes.io/projected/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-api-access-8w5wz\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.738521 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.738466 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e1e680b-97f4-47ec-a54e-c610228971eb-data-volume\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.839673 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.839626 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.839679 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e1e680b-97f4-47ec-a54e-c610228971eb-crio-socket\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.839750 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:17.839770 2559 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.839803 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5wz\" (UniqueName: \"kubernetes.io/projected/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-api-access-8w5wz\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:17.839835 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls podName:4e1e680b-97f4-47ec-a54e-c610228971eb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:18.339819272 +0000 UTC m=+148.554543626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h4qmc" (UID: "4e1e680b-97f4-47ec-a54e-c610228971eb") : secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:17.839873 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.839834 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e1e680b-97f4-47ec-a54e-c610228971eb-crio-socket\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.840161 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.840006 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e1e680b-97f4-47ec-a54e-c610228971eb-data-volume\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.840320 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.840300 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e1e680b-97f4-47ec-a54e-c610228971eb-data-volume\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.840357 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.840308 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:17.847915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:17.847895 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5wz\" (UniqueName: \"kubernetes.io/projected/4e1e680b-97f4-47ec-a54e-c610228971eb-kube-api-access-8w5wz\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:18.320306 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.320274 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7ktpq"]
Apr 24 16:41:18.323304 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.323287 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7ktpq"
Apr 24 16:41:18.325913 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.325889 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 16:41:18.326032 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.325899 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 16:41:18.326830 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.326809 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 16:41:18.326936 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.326854 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pwc64\""
Apr 24 16:41:18.326936 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.326885 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24
16:41:18.330201 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.330179 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7ktpq"] Apr 24 16:41:18.347684 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.347653 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/279e55dc-9053-42b2-a866-94a1153e0925-signing-key\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.347810 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.347698 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tqrd\" (UniqueName: \"kubernetes.io/projected/279e55dc-9053-42b2-a866-94a1153e0925-kube-api-access-7tqrd\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.347810 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.347726 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/279e55dc-9053-42b2-a866-94a1153e0925-signing-cabundle\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.347922 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.347806 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc" Apr 24 16:41:18.347985 ip-10-0-137-179 
kubenswrapper[2559]: E0424 16:41:18.347966 2559 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:18.348062 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:18.348050 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls podName:4e1e680b-97f4-47ec-a54e-c610228971eb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:19.348030089 +0000 UTC m=+149.562754456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h4qmc" (UID: "4e1e680b-97f4-47ec-a54e-c610228971eb") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:18.448510 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.448459 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tqrd\" (UniqueName: \"kubernetes.io/projected/279e55dc-9053-42b2-a866-94a1153e0925-kube-api-access-7tqrd\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.448510 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.448527 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/279e55dc-9053-42b2-a866-94a1153e0925-signing-cabundle\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.448756 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.448659 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/279e55dc-9053-42b2-a866-94a1153e0925-signing-key\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.449266 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.449243 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/279e55dc-9053-42b2-a866-94a1153e0925-signing-cabundle\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.450938 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.450920 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/279e55dc-9053-42b2-a866-94a1153e0925-signing-key\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.459288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.459269 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tqrd\" (UniqueName: \"kubernetes.io/projected/279e55dc-9053-42b2-a866-94a1153e0925-kube-api-access-7tqrd\") pod \"service-ca-865cb79987-7ktpq\" (UID: \"279e55dc-9053-42b2-a866-94a1153e0925\") " pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.633435 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.633343 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7ktpq" Apr 24 16:41:18.748032 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.748000 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7ktpq"] Apr 24 16:41:18.750696 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:18.750659 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod279e55dc_9053_42b2_a866_94a1153e0925.slice/crio-61761fab44febf61165e3256e9616f41b1f906debed29a5ef8e6e1274c2d0664 WatchSource:0}: Error finding container 61761fab44febf61165e3256e9616f41b1f906debed29a5ef8e6e1274c2d0664: Status 404 returned error can't find the container with id 61761fab44febf61165e3256e9616f41b1f906debed29a5ef8e6e1274c2d0664 Apr 24 16:41:18.832821 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:18.832786 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7ktpq" event={"ID":"279e55dc-9053-42b2-a866-94a1153e0925","Type":"ContainerStarted","Data":"61761fab44febf61165e3256e9616f41b1f906debed29a5ef8e6e1274c2d0664"} Apr 24 16:41:19.354145 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:19.354104 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc" Apr 24 16:41:19.354565 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:19.354244 2559 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:19.354565 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:19.354324 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls podName:4e1e680b-97f4-47ec-a54e-c610228971eb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:21.354305538 +0000 UTC m=+151.569029893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h4qmc" (UID: "4e1e680b-97f4-47ec-a54e-c610228971eb") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:21.370590 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.370549 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc" Apr 24 16:41:21.370968 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:21.370689 2559 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:21.370968 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:21.370758 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls podName:4e1e680b-97f4-47ec-a54e-c610228971eb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:25.370742752 +0000 UTC m=+155.585467109 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h4qmc" (UID: "4e1e680b-97f4-47ec-a54e-c610228971eb") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:21.509045 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.508954 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" Apr 24 16:41:21.509045 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.508992 2559 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" Apr 24 16:41:21.509423 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.509406 2559 scope.go:117] "RemoveContainer" containerID="652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06" Apr 24 16:41:21.509588 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:21.509571 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2mjt5_openshift-console-operator(6bb1683c-ac7c-4b67-934a-5a1aa7657a5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" podUID="6bb1683c-ac7c-4b67-934a-5a1aa7657a5f" Apr 24 16:41:21.842415 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.842315 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7ktpq" event={"ID":"279e55dc-9053-42b2-a866-94a1153e0925","Type":"ContainerStarted","Data":"693e9a816a0137135275bedc0ccc2936549cc10eda77c81803796a39975aca77"} Apr 24 16:41:21.861567 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:21.861512 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-865cb79987-7ktpq" podStartSLOduration=1.43922476 podStartE2EDuration="3.861496479s" podCreationTimestamp="2026-04-24 16:41:18 +0000 UTC" firstStartedPulling="2026-04-24 16:41:18.752557469 +0000 UTC m=+148.967281823" lastFinishedPulling="2026-04-24 16:41:21.174829185 +0000 UTC m=+151.389553542" observedRunningTime="2026-04-24 16:41:21.860998115 +0000 UTC m=+152.075722506" watchObservedRunningTime="2026-04-24 16:41:21.861496479 +0000 UTC m=+152.076220855" Apr 24 16:41:25.404594 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:25.404557 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc" Apr 24 16:41:25.405073 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:25.404717 2559 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:25.405073 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:25.404817 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls podName:4e1e680b-97f4-47ec-a54e-c610228971eb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:33.404796151 +0000 UTC m=+163.619520513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h4qmc" (UID: "4e1e680b-97f4-47ec-a54e-c610228971eb") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:25.657222 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:25.657132 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" podUID="10f41d61-9007-409b-a71a-b7938df11802" Apr 24 16:41:25.667851 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:25.667830 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dphxj" podUID="c579f8b7-4799-4d6b-8770-b69441d1c6b7" Apr 24 16:41:25.684236 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:25.684206 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-884bg" podUID="641dddb3-e48f-4ac9-9fad-88baa8d3cd29" Apr 24 16:41:25.852553 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:25.852524 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:41:25.852742 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:25.852566 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-884bg" Apr 24 16:41:25.852828 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:25.852814 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:41:27.414475 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:27.414427 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xlwrd" podUID="9a75b238-62f3-4139-9303-b235113baa9d" Apr 24 16:41:30.650808 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.650766 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:41:30.651315 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.650824 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:41:30.651315 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.651111 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 24 16:41:30.653269 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.653243 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/641dddb3-e48f-4ac9-9fad-88baa8d3cd29-metrics-tls\") pod \"dns-default-884bg\" (UID: \"641dddb3-e48f-4ac9-9fad-88baa8d3cd29\") " pod="openshift-dns/dns-default-884bg" Apr 
24 16:41:30.653482 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.653462 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c579f8b7-4799-4d6b-8770-b69441d1c6b7-cert\") pod \"ingress-canary-dphxj\" (UID: \"c579f8b7-4799-4d6b-8770-b69441d1c6b7\") " pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:41:30.653758 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.653737 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"image-registry-78f7696f44-4cxsn\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:41:30.956256 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.956226 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\"" Apr 24 16:41:30.956256 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.956249 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6j55m\"" Apr 24 16:41:30.956487 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.956259 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\"" Apr 24 16:41:30.963367 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.963351 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-884bg" Apr 24 16:41:30.963430 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.963418 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:41:30.963524 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:30.963510 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dphxj" Apr 24 16:41:31.098325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.098292 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dphxj"] Apr 24 16:41:31.101975 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:31.101947 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc579f8b7_4799_4d6b_8770_b69441d1c6b7.slice/crio-42369af1b7a5c6eaa4c526bce2c4133e5c43ceab9867b628c5f52381ac6fd84b WatchSource:0}: Error finding container 42369af1b7a5c6eaa4c526bce2c4133e5c43ceab9867b628c5f52381ac6fd84b: Status 404 returned error can't find the container with id 42369af1b7a5c6eaa4c526bce2c4133e5c43ceab9867b628c5f52381ac6fd84b Apr 24 16:41:31.325559 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.325488 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"] Apr 24 16:41:31.328666 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.328615 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-884bg"] Apr 24 16:41:31.329210 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:31.329180 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f41d61_9007_409b_a71a_b7938df11802.slice/crio-5d1f73619ccf48ed9b46bf162df09624164a541722eb771fa0d21253166c8fec WatchSource:0}: Error finding container 5d1f73619ccf48ed9b46bf162df09624164a541722eb771fa0d21253166c8fec: Status 404 returned error can't find the container with id 5d1f73619ccf48ed9b46bf162df09624164a541722eb771fa0d21253166c8fec Apr 24 16:41:31.330990 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:31.330970 2559 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641dddb3_e48f_4ac9_9fad_88baa8d3cd29.slice/crio-0f66fb91abb26ca05aad718f4aa81275f4366624d83154220019bbbe9c439217 WatchSource:0}: Error finding container 0f66fb91abb26ca05aad718f4aa81275f4366624d83154220019bbbe9c439217: Status 404 returned error can't find the container with id 0f66fb91abb26ca05aad718f4aa81275f4366624d83154220019bbbe9c439217 Apr 24 16:41:31.872572 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.872197 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-884bg" event={"ID":"641dddb3-e48f-4ac9-9fad-88baa8d3cd29","Type":"ContainerStarted","Data":"0f66fb91abb26ca05aad718f4aa81275f4366624d83154220019bbbe9c439217"} Apr 24 16:41:31.876300 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.875181 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dphxj" event={"ID":"c579f8b7-4799-4d6b-8770-b69441d1c6b7","Type":"ContainerStarted","Data":"42369af1b7a5c6eaa4c526bce2c4133e5c43ceab9867b628c5f52381ac6fd84b"} Apr 24 16:41:31.877815 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.877784 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" event={"ID":"10f41d61-9007-409b-a71a-b7938df11802","Type":"ContainerStarted","Data":"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"} Apr 24 16:41:31.877942 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.877825 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" event={"ID":"10f41d61-9007-409b-a71a-b7938df11802","Type":"ContainerStarted","Data":"5d1f73619ccf48ed9b46bf162df09624164a541722eb771fa0d21253166c8fec"} Apr 24 16:41:31.878313 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.878295 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:41:31.902294 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:31.901556 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" podStartSLOduration=172.901535668 podStartE2EDuration="2m52.901535668s" podCreationTimestamp="2026-04-24 16:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:31.900359783 +0000 UTC m=+162.115084171" watchObservedRunningTime="2026-04-24 16:41:31.901535668 +0000 UTC m=+162.116260044" Apr 24 16:41:32.970523 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:32.970488 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:32.971029 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:32.970627 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:32.971381 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:32.971354 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdada5da-f65e-47aa-8f22-abd619fe9d1c-service-ca-bundle\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b" Apr 24 16:41:32.973847 ip-10-0-137-179 kubenswrapper[2559]: 
I0424 16:41:32.973820 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdada5da-f65e-47aa-8f22-abd619fe9d1c-metrics-certs\") pod \"router-default-7d5f68b498-j2k2b\" (UID: \"cdada5da-f65e-47aa-8f22-abd619fe9d1c\") " pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:33.197518 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.197494 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:33.357669 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.357530 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7d5f68b498-j2k2b"]
Apr 24 16:41:33.476705 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.476185 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:33.478771 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.478742 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e1e680b-97f4-47ec-a54e-c610228971eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qmc\" (UID: \"4e1e680b-97f4-47ec-a54e-c610228971eb\") " pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:33.635373 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.635325 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4qmc"
Apr 24 16:41:33.752449 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.752366 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qmc"]
Apr 24 16:41:33.757848 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:33.757817 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e1e680b_97f4_47ec_a54e_c610228971eb.slice/crio-936a5fd8387261011d224c6ba0274098e9735c70dfc07015f21981e456290ded WatchSource:0}: Error finding container 936a5fd8387261011d224c6ba0274098e9735c70dfc07015f21981e456290ded: Status 404 returned error can't find the container with id 936a5fd8387261011d224c6ba0274098e9735c70dfc07015f21981e456290ded
Apr 24 16:41:33.884298 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.884259 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7d5f68b498-j2k2b" event={"ID":"cdada5da-f65e-47aa-8f22-abd619fe9d1c","Type":"ContainerStarted","Data":"640c644444f67bb7446b23a36b5b049a01101213f01c68c97676c439834e9ee0"}
Apr 24 16:41:33.884298 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.884303 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7d5f68b498-j2k2b" event={"ID":"cdada5da-f65e-47aa-8f22-abd619fe9d1c","Type":"ContainerStarted","Data":"d8e63486c7e32f88dd95e853499250ec659c823021751a400be877cba157b159"}
Apr 24 16:41:33.885574 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.885548 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qmc" event={"ID":"4e1e680b-97f4-47ec-a54e-c610228971eb","Type":"ContainerStarted","Data":"ae0800bcb9ddcb67f96d6dfc73ea5a9c6ea576335751f64698705295526715ea"}
Apr 24 16:41:33.885699 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.885582 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qmc" event={"ID":"4e1e680b-97f4-47ec-a54e-c610228971eb","Type":"ContainerStarted","Data":"936a5fd8387261011d224c6ba0274098e9735c70dfc07015f21981e456290ded"}
Apr 24 16:41:33.887248 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.887210 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-884bg" event={"ID":"641dddb3-e48f-4ac9-9fad-88baa8d3cd29","Type":"ContainerStarted","Data":"dcddc65f001c10bcde87a17923172a3d1dc1237b6053eefbf4df42af0ce0013a"}
Apr 24 16:41:33.887351 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.887254 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-884bg" event={"ID":"641dddb3-e48f-4ac9-9fad-88baa8d3cd29","Type":"ContainerStarted","Data":"f4b9c28f9199e76a490f836d06b660d66b1df0dca0f0e9960022d82f41fcf7a1"}
Apr 24 16:41:33.887404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.887375 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-884bg"
Apr 24 16:41:33.888462 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.888443 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dphxj" event={"ID":"c579f8b7-4799-4d6b-8770-b69441d1c6b7","Type":"ContainerStarted","Data":"4910f272cbb2ce2fa5a9f5785a7b324d76fa98c17078b273f02f91beb8913af7"}
Apr 24 16:41:33.902055 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.902008 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7d5f68b498-j2k2b" podStartSLOduration=32.901995608 podStartE2EDuration="32.901995608s" podCreationTimestamp="2026-04-24 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:33.901875141 +0000 UTC m=+164.116599516" watchObservedRunningTime="2026-04-24 16:41:33.901995608 +0000 UTC m=+164.116719984"
Apr 24 16:41:33.916997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.916945 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dphxj" podStartSLOduration=129.83423342 podStartE2EDuration="2m11.916931847s" podCreationTimestamp="2026-04-24 16:39:22 +0000 UTC" firstStartedPulling="2026-04-24 16:41:31.104584749 +0000 UTC m=+161.319309106" lastFinishedPulling="2026-04-24 16:41:33.187283179 +0000 UTC m=+163.402007533" observedRunningTime="2026-04-24 16:41:33.916437826 +0000 UTC m=+164.131162237" watchObservedRunningTime="2026-04-24 16:41:33.916931847 +0000 UTC m=+164.131656223"
Apr 24 16:41:33.932360 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:33.932313 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-884bg" podStartSLOduration=130.076689509 podStartE2EDuration="2m11.932300468s" podCreationTimestamp="2026-04-24 16:39:22 +0000 UTC" firstStartedPulling="2026-04-24 16:41:31.332992278 +0000 UTC m=+161.547716635" lastFinishedPulling="2026-04-24 16:41:33.188603225 +0000 UTC m=+163.403327594" observedRunningTime="2026-04-24 16:41:33.931855679 +0000 UTC m=+164.146580057" watchObservedRunningTime="2026-04-24 16:41:33.932300468 +0000 UTC m=+164.147024838"
Apr 24 16:41:34.198437 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:34.198389 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:34.201288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:34.201260 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:34.894438 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:34.894401 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qmc" event={"ID":"4e1e680b-97f4-47ec-a54e-c610228971eb","Type":"ContainerStarted","Data":"98d8553669110e5b466f093be73d0eade6f3537f80c51819320f05f6000e744c"}
Apr 24 16:41:34.894963 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:34.894937 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:34.896145 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:34.896125 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7d5f68b498-j2k2b"
Apr 24 16:41:35.404502 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.404459 2559 scope.go:117] "RemoveContainer" containerID="652fdf70714cfdfc20680f56473d8761dfd414689eae9dc468d9c33b1f573b06"
Apr 24 16:41:35.898703 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.898675 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 16:41:35.898889 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.898760 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" event={"ID":"6bb1683c-ac7c-4b67-934a-5a1aa7657a5f","Type":"ContainerStarted","Data":"c39084e4308a942c04329b19ad5c3ddc80743628ae0eb1d8f4b46a63272f8a4f"}
Apr 24 16:41:35.899099 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.899061 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:35.900796 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.900769 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qmc" event={"ID":"4e1e680b-97f4-47ec-a54e-c610228971eb","Type":"ContainerStarted","Data":"53f36f783e62d311dbfb8170c81c375d06ae791cded6a141adc4d38be03d5521"}
Apr 24 16:41:35.915359 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.915314 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5" podStartSLOduration=23.058641582 podStartE2EDuration="24.915299567s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="2026-04-24 16:41:11.629796786 +0000 UTC m=+141.844521140" lastFinishedPulling="2026-04-24 16:41:13.486454768 +0000 UTC m=+143.701179125" observedRunningTime="2026-04-24 16:41:35.914419458 +0000 UTC m=+166.129143835" watchObservedRunningTime="2026-04-24 16:41:35.915299567 +0000 UTC m=+166.130023943"
Apr 24 16:41:35.931060 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:35.931000 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-h4qmc" podStartSLOduration=17.081597678 podStartE2EDuration="18.930981973s" podCreationTimestamp="2026-04-24 16:41:17 +0000 UTC" firstStartedPulling="2026-04-24 16:41:33.813311126 +0000 UTC m=+164.028035482" lastFinishedPulling="2026-04-24 16:41:35.662695422 +0000 UTC m=+165.877419777" observedRunningTime="2026-04-24 16:41:35.930538901 +0000 UTC m=+166.145263279" watchObservedRunningTime="2026-04-24 16:41:35.930981973 +0000 UTC m=+166.145706352"
Apr 24 16:41:36.033869 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:36.033839 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2mjt5"
Apr 24 16:41:37.382591 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.382558 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"]
Apr 24 16:41:37.458983 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.458950 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"]
Apr 24 16:41:37.463553 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.463532 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:37.466875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.466852 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kshj5\""
Apr 24 16:41:37.467230 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.467215 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 16:41:37.476827 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.476798 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"]
Apr 24 16:41:37.609448 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.609349 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hc22j\" (UID: \"3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:37.709943 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.709905 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hc22j\" (UID: \"3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:37.713123 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.713064 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hc22j\" (UID: \"3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:37.772764 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.772727 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:37.900143 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.900003 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"]
Apr 24 16:41:37.902655 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:37.902624 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d26cad2_132b_46c7_b8cc_03e2fc8f5ab1.slice/crio-44778544c33783c2815afbd37acd2a3e43b523241cbb3a164d8613d016d3cdfd WatchSource:0}: Error finding container 44778544c33783c2815afbd37acd2a3e43b523241cbb3a164d8613d016d3cdfd: Status 404 returned error can't find the container with id 44778544c33783c2815afbd37acd2a3e43b523241cbb3a164d8613d016d3cdfd
Apr 24 16:41:37.907147 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:37.907115 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j" event={"ID":"3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1","Type":"ContainerStarted","Data":"44778544c33783c2815afbd37acd2a3e43b523241cbb3a164d8613d016d3cdfd"}
Apr 24 16:41:39.913699 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:39.913660 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j" event={"ID":"3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1","Type":"ContainerStarted","Data":"137c4f6f8b23e0626fa38172ebe3ca890978442a8407df486c2d71c8fcaccb11"}
Apr 24 16:41:39.914191 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:39.913856 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:39.918446 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:39.918421 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j"
Apr 24 16:41:39.931958 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:39.931914 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hc22j" podStartSLOduration=1.829520957 podStartE2EDuration="2.931900392s" podCreationTimestamp="2026-04-24 16:41:37 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.904690052 +0000 UTC m=+168.119414408" lastFinishedPulling="2026-04-24 16:41:39.007069489 +0000 UTC m=+169.221793843" observedRunningTime="2026-04-24 16:41:39.931103741 +0000 UTC m=+170.145828120" watchObservedRunningTime="2026-04-24 16:41:39.931900392 +0000 UTC m=+170.146624768"
Apr 24 16:41:40.372004 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.371929 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"]
Apr 24 16:41:40.375475 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.375458 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.378060 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.378036 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 16:41:40.379249 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.379231 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 16:41:40.379391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.379305 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 16:41:40.379391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.379319 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 16:41:40.379391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.379323 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 16:41:40.379391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.379366 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wfw4r\""
Apr 24 16:41:40.382210 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.382189 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"]
Apr 24 16:41:40.405183 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.405153 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:41:40.430773 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.430739 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60317e3c-b457-4ed8-bb09-3da52605d686-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.430945 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.430805 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.430945 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.430835 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.430945 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.430883 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lzd\" (UniqueName: \"kubernetes.io/projected/60317e3c-b457-4ed8-bb09-3da52605d686-kube-api-access-46lzd\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.531933 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.531890 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.532118 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.531943 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.532118 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.531998 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46lzd\" (UniqueName: \"kubernetes.io/projected/60317e3c-b457-4ed8-bb09-3da52605d686-kube-api-access-46lzd\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.532118 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.532045 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60317e3c-b457-4ed8-bb09-3da52605d686-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.532303 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:40.532181 2559 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 24 16:41:40.532303 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:40.532264 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls podName:60317e3c-b457-4ed8-bb09-3da52605d686 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:41.032246608 +0000 UTC m=+171.246970980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-sf9xh" (UID: "60317e3c-b457-4ed8-bb09-3da52605d686") : secret "prometheus-operator-tls" not found
Apr 24 16:41:40.532836 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.532816 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60317e3c-b457-4ed8-bb09-3da52605d686-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.534389 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.534366 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:40.541153 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:40.541126 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46lzd\" (UniqueName: \"kubernetes.io/projected/60317e3c-b457-4ed8-bb09-3da52605d686-kube-api-access-46lzd\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:41.036254 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.036201 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:41.038684 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.038652 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/60317e3c-b457-4ed8-bb09-3da52605d686-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sf9xh\" (UID: \"60317e3c-b457-4ed8-bb09-3da52605d686\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:41.284674 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.284634 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"
Apr 24 16:41:41.403926 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.403894 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sf9xh"]
Apr 24 16:41:41.407256 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:41.407229 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60317e3c_b457_4ed8_bb09_3da52605d686.slice/crio-4f988ae377f7d8094c9934d425255b5f73192013a147250608e6604a634cd9a5 WatchSource:0}: Error finding container 4f988ae377f7d8094c9934d425255b5f73192013a147250608e6604a634cd9a5: Status 404 returned error can't find the container with id 4f988ae377f7d8094c9934d425255b5f73192013a147250608e6604a634cd9a5
Apr 24 16:41:41.751817 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.751781 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:41:41.756143 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.756126 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.760425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.760397 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4q5zg\""
Apr 24 16:41:41.760425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.760420 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 16:41:41.761298 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.761279 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 16:41:41.761637 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.761615 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 16:41:41.761753 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.761617 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 16:41:41.761753 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.761620 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 16:41:41.762091 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.762062 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 16:41:41.763558 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.763544 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 16:41:41.770597 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.770572 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:41:41.843248 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843213 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.843425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843274 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.843425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843312 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.843425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843337 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljx8\" (UniqueName: \"kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.843425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843383 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.843425 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.843413 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.919695 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.919652 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh" event={"ID":"60317e3c-b457-4ed8-bb09-3da52605d686","Type":"ContainerStarted","Data":"4f988ae377f7d8094c9934d425255b5f73192013a147250608e6604a634cd9a5"}
Apr 24 16:41:41.944369 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944338 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.944547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944391 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wljx8\" (UniqueName: \"kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.944547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944442 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.944547 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944473 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.944715 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944582 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.944715 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.944660 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.945361 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.945336 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.945483 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.945340 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.945741 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.945708 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.947275 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.947229 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.947457 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.947435 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:41.953225 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:41.953185 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljx8\" (UniqueName: \"kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8\") pod \"console-6bd6c8fbb6-zg8m6\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") " pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:42.065280 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.065191 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:41:42.203060 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.203031 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:41:42.490016 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:42.489977 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f11d77_41cc_4f15_8180_379368016c8b.slice/crio-e20bb10333ea8eb198558b9e02b362e605866d46150bb92c35cf3cbbb6ade87c WatchSource:0}: Error finding container e20bb10333ea8eb198558b9e02b362e605866d46150bb92c35cf3cbbb6ade87c: Status 404 returned error can't find the container with id e20bb10333ea8eb198558b9e02b362e605866d46150bb92c35cf3cbbb6ade87c
Apr 24 16:41:42.924156 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.924118 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd6c8fbb6-zg8m6" event={"ID":"73f11d77-41cc-4f15-8180-379368016c8b","Type":"ContainerStarted","Data":"e20bb10333ea8eb198558b9e02b362e605866d46150bb92c35cf3cbbb6ade87c"}
Apr 24 16:41:42.925997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.925955 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh" event={"ID":"60317e3c-b457-4ed8-bb09-3da52605d686","Type":"ContainerStarted","Data":"1d08aeda3f6440f7e436965abb268a64f6afe0d665c293de4383e3dfb3bf271b"}
Apr 24 16:41:42.925997 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.925990 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh" event={"ID":"60317e3c-b457-4ed8-bb09-3da52605d686","Type":"ContainerStarted","Data":"98dab8659e11574f17773643ddf8dcded59beb276e1af6d011d26ba88d6cd39d"}
Apr 24 16:41:42.944781 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:42.944722 2559 pod_startup_latency_tracker.go:104] "Observed pod
startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-sf9xh" podStartSLOduration=1.819197454 podStartE2EDuration="2.944703736s" podCreationTimestamp="2026-04-24 16:41:40 +0000 UTC" firstStartedPulling="2026-04-24 16:41:41.409130567 +0000 UTC m=+171.623854926" lastFinishedPulling="2026-04-24 16:41:42.534636849 +0000 UTC m=+172.749361208" observedRunningTime="2026-04-24 16:41:42.942769765 +0000 UTC m=+173.157494153" watchObservedRunningTime="2026-04-24 16:41:42.944703736 +0000 UTC m=+173.159428112" Apr 24 16:41:43.897171 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:43.897138 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-884bg" Apr 24 16:41:44.781905 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.781694 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7jj5d"] Apr 24 16:41:44.791007 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.790885 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.794760 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.794346 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:41:44.794760 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.794419 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sv478\"" Apr 24 16:41:44.794760 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.794346 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:41:44.795339 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.795214 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.870943 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-root\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.870988 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-metrics-client-ca\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871021 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-textfile\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871046 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9wck\" (UniqueName: \"kubernetes.io/projected/bf3d26f4-bc53-404d-bbcf-94226902e26e-kube-api-access-j9wck\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871117 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-sys\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871144 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871187 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-wtmp\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 
kubenswrapper[2559]: I0424 16:41:44.871218 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.871386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.871253 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.972741 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972701 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-wtmp\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972752 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972791 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972837 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-root\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972862 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-metrics-client-ca\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972891 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-textfile\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972914 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9wck\" (UniqueName: \"kubernetes.io/projected/bf3d26f4-bc53-404d-bbcf-94226902e26e-kube-api-access-j9wck\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972922 2559 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-wtmp\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.972987 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-sys\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.973021 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973219 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.973164 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-root\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973757 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:44.973267 2559 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 16:41:44.973757 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:41:44.973338 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls podName:bf3d26f4-bc53-404d-bbcf-94226902e26e nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:45.473305755 +0000 UTC m=+175.688030126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls") pod "node-exporter-7jj5d" (UID: "bf3d26f4-bc53-404d-bbcf-94226902e26e") : secret "node-exporter-tls" not found Apr 24 16:41:44.973757 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.973529 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf3d26f4-bc53-404d-bbcf-94226902e26e-sys\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973904 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.973801 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-textfile\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.973904 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.973886 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-metrics-client-ca\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.974044 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.974001 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " 
pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.976928 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.976903 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:44.984309 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:44.984266 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9wck\" (UniqueName: \"kubernetes.io/projected/bf3d26f4-bc53-404d-bbcf-94226902e26e-kube-api-access-j9wck\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:45.477138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.477097 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:45.479508 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.479470 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bf3d26f4-bc53-404d-bbcf-94226902e26e-node-exporter-tls\") pod \"node-exporter-7jj5d\" (UID: \"bf3d26f4-bc53-404d-bbcf-94226902e26e\") " pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:45.704844 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.704803 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7jj5d" Apr 24 16:41:45.715002 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:41:45.714962 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf3d26f4_bc53_404d_bbcf_94226902e26e.slice/crio-b8155daa33bdb3d7e895f146f97864c17f1d601ec1358a69a8dfcee9b1ea367b WatchSource:0}: Error finding container b8155daa33bdb3d7e895f146f97864c17f1d601ec1358a69a8dfcee9b1ea367b: Status 404 returned error can't find the container with id b8155daa33bdb3d7e895f146f97864c17f1d601ec1358a69a8dfcee9b1ea367b Apr 24 16:41:45.935612 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.935578 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jj5d" event={"ID":"bf3d26f4-bc53-404d-bbcf-94226902e26e","Type":"ContainerStarted","Data":"b8155daa33bdb3d7e895f146f97864c17f1d601ec1358a69a8dfcee9b1ea367b"} Apr 24 16:41:45.936987 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.936960 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd6c8fbb6-zg8m6" event={"ID":"73f11d77-41cc-4f15-8180-379368016c8b","Type":"ContainerStarted","Data":"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"} Apr 24 16:41:45.966015 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:45.965961 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bd6c8fbb6-zg8m6" podStartSLOduration=2.3559947230000002 podStartE2EDuration="4.96594752s" podCreationTimestamp="2026-04-24 16:41:41 +0000 UTC" firstStartedPulling="2026-04-24 16:41:42.49185613 +0000 UTC m=+172.706580485" lastFinishedPulling="2026-04-24 16:41:45.101808923 +0000 UTC m=+175.316533282" observedRunningTime="2026-04-24 16:41:45.964942609 +0000 UTC m=+176.179666985" watchObservedRunningTime="2026-04-24 16:41:45.96594752 +0000 UTC m=+176.180671895" Apr 24 16:41:46.941337 ip-10-0-137-179 
kubenswrapper[2559]: I0424 16:41:46.941305 2559 generic.go:358] "Generic (PLEG): container finished" podID="bf3d26f4-bc53-404d-bbcf-94226902e26e" containerID="07cb1a1170b3263e408e68cc270f40091de97372ef9320745413c27614938ab3" exitCode=0 Apr 24 16:41:46.941739 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:46.941387 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jj5d" event={"ID":"bf3d26f4-bc53-404d-bbcf-94226902e26e","Type":"ContainerDied","Data":"07cb1a1170b3263e408e68cc270f40091de97372ef9320745413c27614938ab3"} Apr 24 16:41:47.388958 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:47.388884 2559 patch_prober.go:28] interesting pod/image-registry-78f7696f44-4cxsn container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:41:47.389111 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:47.388937 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" podUID="10f41d61-9007-409b-a71a-b7938df11802" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:41:47.946817 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:47.946781 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jj5d" event={"ID":"bf3d26f4-bc53-404d-bbcf-94226902e26e","Type":"ContainerStarted","Data":"c3d6ff60da8d95e3be64257ccbde96cfd658cb745b0eac51e68be1cd3fd85c75"} Apr 24 16:41:47.946817 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:47.946816 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jj5d" 
event={"ID":"bf3d26f4-bc53-404d-bbcf-94226902e26e","Type":"ContainerStarted","Data":"8c4fa1f7f6baf3c42871addf9a740114a663675a0bc3d931717eded5b0392edb"} Apr 24 16:41:47.967839 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:47.967789 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7jj5d" podStartSLOduration=3.299621947 podStartE2EDuration="3.967774054s" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="2026-04-24 16:41:45.716779163 +0000 UTC m=+175.931503518" lastFinishedPulling="2026-04-24 16:41:46.384931272 +0000 UTC m=+176.599655625" observedRunningTime="2026-04-24 16:41:47.966048206 +0000 UTC m=+178.180772617" watchObservedRunningTime="2026-04-24 16:41:47.967774054 +0000 UTC m=+178.182498429" Apr 24 16:41:52.066134 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:52.066096 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bd6c8fbb6-zg8m6" Apr 24 16:41:52.066579 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:52.066145 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bd6c8fbb6-zg8m6" Apr 24 16:41:52.071626 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:52.071604 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bd6c8fbb6-zg8m6" Apr 24 16:41:52.964762 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:52.964733 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bd6c8fbb6-zg8m6" Apr 24 16:41:57.386709 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:41:57.386678 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:42:02.402449 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.402375 2559 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" podUID="10f41d61-9007-409b-a71a-b7938df11802" containerName="registry" containerID="cri-o://55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3" gracePeriod=30 Apr 24 16:42:02.652340 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.652271 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" Apr 24 16:42:02.729125 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729067 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729140 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729186 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729216 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: 
\"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729243 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729284 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbjkn\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729325 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729311 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729626 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729361 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") pod \"10f41d61-9007-409b-a71a-b7938df11802\" (UID: \"10f41d61-9007-409b-a71a-b7938df11802\") " Apr 24 16:42:02.729850 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729785 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:02.730159 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.729989 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:02.732351 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.732297 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:02.732466 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.732371 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn" (OuterVolumeSpecName: "kube-api-access-pbjkn") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "kube-api-access-pbjkn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:02.732466 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.732417 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:02.732587 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.732434 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:02.732814 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.732766 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:02.748749 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.748703 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "10f41d61-9007-409b-a71a-b7938df11802" (UID: "10f41d61-9007-409b-a71a-b7938df11802"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:42:02.830138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830103 2559 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10f41d61-9007-409b-a71a-b7938df11802-ca-trust-extracted\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830132 2559 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-trusted-ca\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830141 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbjkn\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-kube-api-access-pbjkn\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830152 2559 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10f41d61-9007-409b-a71a-b7938df11802-registry-certificates\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830161 2559 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-registry-tls\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830170 2559 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-installation-pull-secrets\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830178 2559 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10f41d61-9007-409b-a71a-b7938df11802-bound-sa-token\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.830379 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.830189 2559 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/10f41d61-9007-409b-a71a-b7938df11802-image-registry-private-configuration\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:02.988002 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.987912 2559 generic.go:358] "Generic (PLEG): container finished" podID="10f41d61-9007-409b-a71a-b7938df11802" containerID="55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3" exitCode=0
Apr 24 16:42:02.988192 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.988000 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn"
Apr 24 16:42:02.988192 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.987998 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" event={"ID":"10f41d61-9007-409b-a71a-b7938df11802","Type":"ContainerDied","Data":"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"}
Apr 24 16:42:02.988192 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.988042 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78f7696f44-4cxsn" event={"ID":"10f41d61-9007-409b-a71a-b7938df11802","Type":"ContainerDied","Data":"5d1f73619ccf48ed9b46bf162df09624164a541722eb771fa0d21253166c8fec"}
Apr 24 16:42:02.988192 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.988060 2559 scope.go:117] "RemoveContainer" containerID="55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"
Apr 24 16:42:02.996638 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.996538 2559 scope.go:117] "RemoveContainer" containerID="55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"
Apr 24 16:42:02.996922 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:42:02.996900 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3\": container with ID starting with 55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3 not found: ID does not exist" containerID="55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"
Apr 24 16:42:02.996998 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:02.996935 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3"} err="failed to get container status \"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3\": rpc error: code = NotFound desc = could not find container \"55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3\": container with ID starting with 55354b3e8d45278ebe7f2f9fcab61387a0b8a824d783a253477849ea0bf9cab3 not found: ID does not exist"
Apr 24 16:42:03.012300 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:03.012276 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"]
Apr 24 16:42:03.021961 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:03.021924 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-78f7696f44-4cxsn"]
Apr 24 16:42:03.784693 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:03.784652 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:42:04.409065 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:04.409031 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f41d61-9007-409b-a71a-b7938df11802" path="/var/lib/kubelet/pods/10f41d61-9007-409b-a71a-b7938df11802/volumes"
Apr 24 16:42:28.803864 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:28.803797 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bd6c8fbb6-zg8m6" podUID="73f11d77-41cc-4f15-8180-379368016c8b" containerName="console" containerID="cri-o://fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99" gracePeriod=15
Apr 24 16:42:29.040558 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.040537 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bd6c8fbb6-zg8m6_73f11d77-41cc-4f15-8180-379368016c8b/console/0.log"
Apr 24 16:42:29.040664 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.040597 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:42:29.058138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058060 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bd6c8fbb6-zg8m6_73f11d77-41cc-4f15-8180-379368016c8b/console/0.log"
Apr 24 16:42:29.058138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058109 2559 generic.go:358] "Generic (PLEG): container finished" podID="73f11d77-41cc-4f15-8180-379368016c8b" containerID="fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99" exitCode=2
Apr 24 16:42:29.058302 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058145 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd6c8fbb6-zg8m6" event={"ID":"73f11d77-41cc-4f15-8180-379368016c8b","Type":"ContainerDied","Data":"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"}
Apr 24 16:42:29.058302 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058167 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd6c8fbb6-zg8m6"
Apr 24 16:42:29.058302 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058192 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd6c8fbb6-zg8m6" event={"ID":"73f11d77-41cc-4f15-8180-379368016c8b","Type":"ContainerDied","Data":"e20bb10333ea8eb198558b9e02b362e605866d46150bb92c35cf3cbbb6ade87c"}
Apr 24 16:42:29.058302 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.058214 2559 scope.go:117] "RemoveContainer" containerID="fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"
Apr 24 16:42:29.065849 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.065806 2559 scope.go:117] "RemoveContainer" containerID="fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"
Apr 24 16:42:29.066342 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:42:29.066276 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99\": container with ID starting with fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99 not found: ID does not exist" containerID="fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"
Apr 24 16:42:29.066342 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.066311 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99"} err="failed to get container status \"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99\": rpc error: code = NotFound desc = could not find container \"fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99\": container with ID starting with fd3f310dc1182b45be7b536791ed30013fda583f5a79b1fe8f3a10d895650e99 not found: ID does not exist"
Apr 24 16:42:29.137846 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137812 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wljx8\" (UniqueName: \"kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.137846 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137854 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.138109 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137893 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.138109 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137913 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.138109 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137940 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.138109 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.137969 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca\") pod \"73f11d77-41cc-4f15-8180-379368016c8b\" (UID: \"73f11d77-41cc-4f15-8180-379368016c8b\") "
Apr 24 16:42:29.138473 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.138436 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca" (OuterVolumeSpecName: "service-ca") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:29.138563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.138465 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config" (OuterVolumeSpecName: "console-config") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:29.138563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.138509 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:29.140291 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.140264 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8" (OuterVolumeSpecName: "kube-api-access-wljx8") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "kube-api-access-wljx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:29.140392 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.140285 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:29.140392 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.140343 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "73f11d77-41cc-4f15-8180-379368016c8b" (UID: "73f11d77-41cc-4f15-8180-379368016c8b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:29.239411 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239377 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wljx8\" (UniqueName: \"kubernetes.io/projected/73f11d77-41cc-4f15-8180-379368016c8b-kube-api-access-wljx8\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.239411 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239405 2559 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-console-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.239411 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239414 2559 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.239411 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239423 2559 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-oauth-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.239657 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239431 2559 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73f11d77-41cc-4f15-8180-379368016c8b-console-oauth-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.239657 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.239441 2559 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73f11d77-41cc-4f15-8180-379368016c8b-service-ca\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:42:29.381906 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.381875 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:42:29.386129 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:29.386105 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bd6c8fbb6-zg8m6"]
Apr 24 16:42:30.408501 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:42:30.408460 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f11d77-41cc-4f15-8180-379368016c8b" path="/var/lib/kubelet/pods/73f11d77-41cc-4f15-8180-379368016c8b/volumes"
Apr 24 16:43:02.198618 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:02.198573 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:43:02.200835 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:02.200814 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a75b238-62f3-4139-9303-b235113baa9d-metrics-certs\") pod \"network-metrics-daemon-xlwrd\" (UID: \"9a75b238-62f3-4139-9303-b235113baa9d\") " pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:43:02.308696 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:02.308661 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\""
Apr 24 16:43:02.316510 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:02.316488 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xlwrd"
Apr 24 16:43:02.437369 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:02.437337 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xlwrd"]
Apr 24 16:43:02.441648 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:43:02.441611 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a75b238_62f3_4139_9303_b235113baa9d.slice/crio-26378f18e8f2e37569e52b4d35a50da9f66cbc2f739a885b15dc1a3fee11146d WatchSource:0}: Error finding container 26378f18e8f2e37569e52b4d35a50da9f66cbc2f739a885b15dc1a3fee11146d: Status 404 returned error can't find the container with id 26378f18e8f2e37569e52b4d35a50da9f66cbc2f739a885b15dc1a3fee11146d
Apr 24 16:43:03.144596 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:03.144551 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xlwrd" event={"ID":"9a75b238-62f3-4139-9303-b235113baa9d","Type":"ContainerStarted","Data":"26378f18e8f2e37569e52b4d35a50da9f66cbc2f739a885b15dc1a3fee11146d"}
Apr 24 16:43:04.149870 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:04.149470 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xlwrd" event={"ID":"9a75b238-62f3-4139-9303-b235113baa9d","Type":"ContainerStarted","Data":"d5f631b5db6fc306bd93e77db0dc0afbf648bd109b601deb8431f47bf0b2c7c7"}
Apr 24 16:43:04.149870 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:04.149513 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xlwrd" event={"ID":"9a75b238-62f3-4139-9303-b235113baa9d","Type":"ContainerStarted","Data":"26f3de071cbb74d520ebf67c4f2d28be1f2cfcc2e4ca1736f5d25d63e83feff8"}
Apr 24 16:43:04.170347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:04.170284 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xlwrd" podStartSLOduration=253.085219819 podStartE2EDuration="4m14.17026439s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:43:02.443543997 +0000 UTC m=+252.658268354" lastFinishedPulling="2026-04-24 16:43:03.528588564 +0000 UTC m=+253.743312925" observedRunningTime="2026-04-24 16:43:04.168025026 +0000 UTC m=+254.382749403" watchObservedRunningTime="2026-04-24 16:43:04.17026439 +0000 UTC m=+254.384988768"
Apr 24 16:43:09.231139 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231029 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-596946bff9-5878t"]
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231386 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10f41d61-9007-409b-a71a-b7938df11802" containerName="registry"
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231403 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f41d61-9007-409b-a71a-b7938df11802" containerName="registry"
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231422 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73f11d77-41cc-4f15-8180-379368016c8b" containerName="console"
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231430 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f11d77-41cc-4f15-8180-379368016c8b" containerName="console"
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231499 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="73f11d77-41cc-4f15-8180-379368016c8b" containerName="console"
Apr 24 16:43:09.231600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.231512 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="10f41d61-9007-409b-a71a-b7938df11802" containerName="registry"
Apr 24 16:43:09.234869 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.234848 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.238011 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.237979 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 16:43:09.238460 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.238437 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 16:43:09.238577 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.238558 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 16:43:09.238808 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.238790 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-ttkpn\""
Apr 24 16:43:09.238903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.238883 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 16:43:09.239105 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.239070 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 16:43:09.245354 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.245312 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 16:43:09.247990 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.247958 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-596946bff9-5878t"]
Apr 24 16:43:09.254480 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254455 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-metrics-client-ca\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254506 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254600 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254531 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764zm\" (UniqueName: \"kubernetes.io/projected/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-kube-api-access-764zm\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254720 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254603 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254720 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254640 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-federate-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254720 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254659 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-serving-certs-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254720 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254674 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.254923 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.254727 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356018 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.355972 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356039 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-764zm\" (UniqueName: \"kubernetes.io/projected/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-kube-api-access-764zm\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356102 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356147 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-federate-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356178 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-serving-certs-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356205 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356475 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356249 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356475 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356297 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-metrics-client-ca\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.356942 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.356914 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-serving-certs-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.357121 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.357100 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.357688 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.357668 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-metrics-client-ca\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.358862 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.358835 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.359202 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.359184 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-federate-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.359202 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.359189 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-secret-telemeter-client\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.359310 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.359189 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-telemeter-client-tls\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.365446 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.365419 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-764zm\" (UniqueName: \"kubernetes.io/projected/bffc6a3e-834d-435f-bb13-f88d1ac55ab1-kube-api-access-764zm\") pod \"telemeter-client-596946bff9-5878t\" (UID: \"bffc6a3e-834d-435f-bb13-f88d1ac55ab1\") " pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.546179 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.546072 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-596946bff9-5878t"
Apr 24 16:43:09.686896 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:09.686855 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-596946bff9-5878t"]
Apr 24 16:43:09.691878 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:43:09.691850 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffc6a3e_834d_435f_bb13_f88d1ac55ab1.slice/crio-aefd0eea4aaa01d4659811a20609640ed6cf14bef120ee47469e009bef07b9b8 WatchSource:0}: Error finding container aefd0eea4aaa01d4659811a20609640ed6cf14bef120ee47469e009bef07b9b8: Status 404 returned error can't find the container with id aefd0eea4aaa01d4659811a20609640ed6cf14bef120ee47469e009bef07b9b8
Apr 24 16:43:10.170314 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:10.170280 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-596946bff9-5878t" event={"ID":"bffc6a3e-834d-435f-bb13-f88d1ac55ab1","Type":"ContainerStarted","Data":"aefd0eea4aaa01d4659811a20609640ed6cf14bef120ee47469e009bef07b9b8"}
Apr 24 16:43:11.174528 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:11.174497 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-596946bff9-5878t" event={"ID":"bffc6a3e-834d-435f-bb13-f88d1ac55ab1","Type":"ContainerStarted","Data":"aa93ce5bae95adc4929925b4a8c1d0fa1fdde111040c3ffc727d2805b35d9bc2"}
Apr 24 16:43:13.183715 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.183678 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-596946bff9-5878t" event={"ID":"bffc6a3e-834d-435f-bb13-f88d1ac55ab1","Type":"ContainerStarted","Data":"b79c3985446d4121b16048b046040c33255de92bc8400f1108abe43cbb62c40c"}
Apr 24 16:43:13.184105 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.183722 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-596946bff9-5878t" event={"ID":"bffc6a3e-834d-435f-bb13-f88d1ac55ab1","Type":"ContainerStarted","Data":"6a861bd9f34e998289650b66f0895919eb9e46e36f875c07853e27457eb3b8d9"}
Apr 24 16:43:13.221601 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.221535 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-596946bff9-5878t" podStartSLOduration=1.652998889 podStartE2EDuration="4.221515024s" podCreationTimestamp="2026-04-24 16:43:09 +0000 UTC" firstStartedPulling="2026-04-24 16:43:09.696585426 +0000 UTC m=+259.911309787" lastFinishedPulling="2026-04-24 16:43:12.265101568 +0000 UTC m=+262.479825922" observedRunningTime="2026-04-24 16:43:13.219206449 +0000 UTC m=+263.433930826" watchObservedRunningTime="2026-04-24 16:43:13.221515024 +0000 UTC m=+263.436239402"
Apr 24 16:43:13.876832 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.876796 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"]
Apr 24 16:43:13.879972 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.879949 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.882596 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.882573 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:43:13.883638 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.883613 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:43:13.883993 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.883811 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:43:13.884070 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.884035 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:43:13.884156 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.884142 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:43:13.884210 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.884158 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:43:13.884426 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.884405 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4q5zg\"" Apr 24 16:43:13.884503 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.884447 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:43:13.888268 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.887949 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:43:13.889585 ip-10-0-137-179 
kubenswrapper[2559]: I0424 16:43:13.889537 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"] Apr 24 16:43:13.987634 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987589 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.987634 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987635 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876km\" (UniqueName: \"kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.987875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987719 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.987875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987754 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.987875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987830 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.987875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987865 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:13.988026 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:13.987885 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.088647 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088609 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.088647 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088647 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-876km\" (UniqueName: \"kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " 
pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.088896 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088674 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.088896 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088699 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.088896 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088727 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089054 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088913 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089054 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.088964 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle\") pod 
\"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089474 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.089454 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089559 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.089538 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089638 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.089607 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.089884 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.089869 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.091673 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.091652 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.091787 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.091766 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.097185 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.097163 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-876km\" (UniqueName: \"kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km\") pod \"console-7f54dbb9b8-rccb5\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.190925 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.190895 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:14.329403 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:14.329375 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"] Apr 24 16:43:14.331353 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:43:14.331321 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode79aa462_a617_4d68_9ac6_7407e96947ef.slice/crio-de911428fdaac539add4cf95966a83bebde1711bdbeceac646b9b555066a5a66 WatchSource:0}: Error finding container de911428fdaac539add4cf95966a83bebde1711bdbeceac646b9b555066a5a66: Status 404 returned error can't find the container with id de911428fdaac539add4cf95966a83bebde1711bdbeceac646b9b555066a5a66 Apr 24 16:43:15.190373 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:15.190335 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f54dbb9b8-rccb5" event={"ID":"e79aa462-a617-4d68-9ac6-7407e96947ef","Type":"ContainerStarted","Data":"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54"} Apr 24 16:43:15.190373 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:15.190372 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f54dbb9b8-rccb5" event={"ID":"e79aa462-a617-4d68-9ac6-7407e96947ef","Type":"ContainerStarted","Data":"de911428fdaac539add4cf95966a83bebde1711bdbeceac646b9b555066a5a66"} Apr 24 16:43:15.209951 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:15.209902 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f54dbb9b8-rccb5" podStartSLOduration=2.209888505 podStartE2EDuration="2.209888505s" podCreationTimestamp="2026-04-24 16:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:15.208602001 +0000 UTC 
m=+265.423326390" watchObservedRunningTime="2026-04-24 16:43:15.209888505 +0000 UTC m=+265.424612880" Apr 24 16:43:24.191282 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:24.191243 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:24.191790 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:24.191326 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:24.195816 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:24.195792 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:24.219024 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:24.218999 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:43:50.264982 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:50.264953 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:43:50.265882 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:50.265862 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:43:50.267725 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:50.267704 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 16:43:50.268347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:50.268328 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 16:43:50.275049 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:43:50.275034 2559 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:44:29.532836 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.532796 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6578576886-sk8bz"] Apr 24 16:44:29.535961 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.535936 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.560070 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.560044 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6578576886-sk8bz"] Apr 24 16:44:29.659073 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659032 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659073 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659092 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659144 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vq8\" (UniqueName: \"kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " 
pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659209 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659235 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659256 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.659333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.659285 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760517 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760483 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760517 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760518 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760704 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760540 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760761 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760746 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760801 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760779 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760801 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760795 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-78vq8\" (UniqueName: \"kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.760886 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.760829 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.761383 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.761359 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.761521 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.761361 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.761591 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.761567 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.761717 ip-10-0-137-179 kubenswrapper[2559]: I0424 
16:44:29.761694 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.763073 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.763053 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.763186 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.763177 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.768947 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.768926 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vq8\" (UniqueName: \"kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8\") pod \"console-6578576886-sk8bz\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") " pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.846770 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.846672 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:29.972421 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.972398 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6578576886-sk8bz"] Apr 24 16:44:29.975058 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:44:29.975027 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ede5be1_ef30_43e1_9977_4c4299dbb2f9.slice/crio-3b3e7e5e65fd055285066115621eede0a0240af57ac6a829a66d803d86a5996e WatchSource:0}: Error finding container 3b3e7e5e65fd055285066115621eede0a0240af57ac6a829a66d803d86a5996e: Status 404 returned error can't find the container with id 3b3e7e5e65fd055285066115621eede0a0240af57ac6a829a66d803d86a5996e Apr 24 16:44:29.977253 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:29.977236 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:44:30.392254 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:30.392220 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6578576886-sk8bz" event={"ID":"8ede5be1-ef30-43e1-9977-4c4299dbb2f9","Type":"ContainerStarted","Data":"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"} Apr 24 16:44:30.392254 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:30.392258 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6578576886-sk8bz" event={"ID":"8ede5be1-ef30-43e1-9977-4c4299dbb2f9","Type":"ContainerStarted","Data":"3b3e7e5e65fd055285066115621eede0a0240af57ac6a829a66d803d86a5996e"} Apr 24 16:44:30.411217 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:30.411170 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6578576886-sk8bz" podStartSLOduration=1.411156267 podStartE2EDuration="1.411156267s" podCreationTimestamp="2026-04-24 16:44:29 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:44:30.409841665 +0000 UTC m=+340.624566041" watchObservedRunningTime="2026-04-24 16:44:30.411156267 +0000 UTC m=+340.625880643" Apr 24 16:44:39.847418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:39.847329 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:39.847418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:39.847371 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:39.852129 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:39.852104 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:40.421693 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:40.421664 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6578576886-sk8bz" Apr 24 16:44:40.468652 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:44:40.468616 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"] Apr 24 16:45:05.493012 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.492959 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f54dbb9b8-rccb5" podUID="e79aa462-a617-4d68-9ac6-7407e96947ef" containerName="console" containerID="cri-o://59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54" gracePeriod=15 Apr 24 16:45:05.729208 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.729183 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f54dbb9b8-rccb5_e79aa462-a617-4d68-9ac6-7407e96947ef/console/0.log" Apr 24 16:45:05.729337 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.729243 2559 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:45:05.835105 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.834995 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835105 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835031 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835105 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835047 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835128 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835157 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: 
\"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835184 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835394 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835236 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-876km\" (UniqueName: \"kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km\") pod \"e79aa462-a617-4d68-9ac6-7407e96947ef\" (UID: \"e79aa462-a617-4d68-9ac6-7407e96947ef\") " Apr 24 16:45:05.835587 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835487 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:05.835587 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835538 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config" (OuterVolumeSpecName: "console-config") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:05.835587 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835566 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:05.835683 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.835604 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:05.837375 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.837347 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:05.837486 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.837417 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:05.837486 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.837453 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km" (OuterVolumeSpecName: "kube-api-access-876km") pod "e79aa462-a617-4d68-9ac6-7407e96947ef" (UID: "e79aa462-a617-4d68-9ac6-7407e96947ef"). InnerVolumeSpecName "kube-api-access-876km". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:05.936235 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936188 2559 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-service-ca\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936235 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936226 2559 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936235 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936239 2559 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e79aa462-a617-4d68-9ac6-7407e96947ef-console-oauth-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936252 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-876km\" (UniqueName: \"kubernetes.io/projected/e79aa462-a617-4d68-9ac6-7407e96947ef-kube-api-access-876km\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936266 2559 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-oauth-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936278 2559 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-console-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:05.936469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:05.936289 2559 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e79aa462-a617-4d68-9ac6-7407e96947ef-trusted-ca-bundle\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:45:06.488208 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488180 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f54dbb9b8-rccb5_e79aa462-a617-4d68-9ac6-7407e96947ef/console/0.log" Apr 24 16:45:06.488404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488223 2559 generic.go:358] "Generic (PLEG): container finished" podID="e79aa462-a617-4d68-9ac6-7407e96947ef" containerID="59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54" exitCode=2 Apr 24 16:45:06.488404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488284 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f54dbb9b8-rccb5" Apr 24 16:45:06.488404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488293 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f54dbb9b8-rccb5" event={"ID":"e79aa462-a617-4d68-9ac6-7407e96947ef","Type":"ContainerDied","Data":"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54"} Apr 24 16:45:06.488404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488327 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f54dbb9b8-rccb5" event={"ID":"e79aa462-a617-4d68-9ac6-7407e96947ef","Type":"ContainerDied","Data":"de911428fdaac539add4cf95966a83bebde1711bdbeceac646b9b555066a5a66"} Apr 24 16:45:06.488404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.488346 2559 scope.go:117] "RemoveContainer" containerID="59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54" Apr 24 16:45:06.496138 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.495932 2559 scope.go:117] "RemoveContainer" containerID="59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54" Apr 24 16:45:06.496398 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:45:06.496191 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54\": container with ID starting with 59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54 not found: ID does not exist" containerID="59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54" Apr 24 16:45:06.496398 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.496229 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54"} err="failed to get container status \"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54\": rpc error: code = 
NotFound desc = could not find container \"59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54\": container with ID starting with 59a2e8e42ba543c9ca3cb4a04bfebabe6189aeef8678b1631cf1a760710b7f54 not found: ID does not exist" Apr 24 16:45:06.507838 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.507814 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"] Apr 24 16:45:06.511152 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:06.511133 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f54dbb9b8-rccb5"] Apr 24 16:45:08.407563 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:45:08.407531 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e79aa462-a617-4d68-9ac6-7407e96947ef" path="/var/lib/kubelet/pods/e79aa462-a617-4d68-9ac6-7407e96947ef/volumes" Apr 24 16:46:13.778021 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.777943 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr"] Apr 24 16:46:13.778467 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.778277 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e79aa462-a617-4d68-9ac6-7407e96947ef" containerName="console" Apr 24 16:46:13.778467 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.778289 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79aa462-a617-4d68-9ac6-7407e96947ef" containerName="console" Apr 24 16:46:13.778467 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.778339 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="e79aa462-a617-4d68-9ac6-7407e96947ef" containerName="console" Apr 24 16:46:13.781242 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.781226 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.784872 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.784841 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 16:46:13.785103 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.785073 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-krwfw\"" Apr 24 16:46:13.785898 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.785882 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 16:46:13.796630 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.796604 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr"] Apr 24 16:46:13.842954 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.842921 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.843155 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.842966 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f4s\" (UniqueName: \"kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.843155 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.843012 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.944054 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.944002 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.944054 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.944069 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f4s\" (UniqueName: \"kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.944300 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.944108 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.944431 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.944408 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.944491 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.944439 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:13.953554 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:13.953519 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f4s\" (UniqueName: \"kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:14.090791 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:14.090689 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:14.219064 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:14.219038 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr"] Apr 24 16:46:14.220134 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:46:14.220105 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7a02fd_8f6f_4ca1_81cd_b3626bf5325f.slice/crio-b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868 WatchSource:0}: Error finding container b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868: Status 404 returned error can't find the container with id b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868 Apr 24 16:46:14.670628 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:14.670588 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" event={"ID":"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f","Type":"ContainerStarted","Data":"b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868"} Apr 24 16:46:19.685916 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:19.685882 2559 generic.go:358] "Generic (PLEG): container finished" podID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerID="e6946557b7bca700fbddfac3c9f97f8452dda7beda467bdb900c2222974ac748" exitCode=0 Apr 24 16:46:19.686327 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:19.685971 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" event={"ID":"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f","Type":"ContainerDied","Data":"e6946557b7bca700fbddfac3c9f97f8452dda7beda467bdb900c2222974ac748"} Apr 24 16:46:21.693806 ip-10-0-137-179 kubenswrapper[2559]: 
I0424 16:46:21.693725 2559 generic.go:358] "Generic (PLEG): container finished" podID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerID="d005dd073ec7df619f66af3e592781be95193bcbea59f886a101092001ad9853" exitCode=0 Apr 24 16:46:21.693806 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:21.693764 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" event={"ID":"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f","Type":"ContainerDied","Data":"d005dd073ec7df619f66af3e592781be95193bcbea59f886a101092001ad9853"} Apr 24 16:46:29.719342 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:29.719308 2559 generic.go:358] "Generic (PLEG): container finished" podID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerID="3f9dfcb045137455a555b5190e3393e97958cf5945b52618dbf86cc2a75ffc8b" exitCode=0 Apr 24 16:46:29.719853 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:29.719393 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" event={"ID":"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f","Type":"ContainerDied","Data":"3f9dfcb045137455a555b5190e3393e97958cf5945b52618dbf86cc2a75ffc8b"} Apr 24 16:46:30.841207 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.841184 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:30.997629 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.997517 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle\") pod \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " Apr 24 16:46:30.997816 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.997643 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util\") pod \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " Apr 24 16:46:30.997816 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.997669 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5f4s\" (UniqueName: \"kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s\") pod \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\" (UID: \"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f\") " Apr 24 16:46:30.998173 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.998150 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle" (OuterVolumeSpecName: "bundle") pod "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" (UID: "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:46:30.999837 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:30.999816 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s" (OuterVolumeSpecName: "kube-api-access-w5f4s") pod "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" (UID: "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f"). InnerVolumeSpecName "kube-api-access-w5f4s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:46:31.001685 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.001656 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util" (OuterVolumeSpecName: "util") pod "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" (UID: "1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:46:31.099133 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.099098 2559 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-util\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:46:31.099133 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.099128 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5f4s\" (UniqueName: \"kubernetes.io/projected/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-kube-api-access-w5f4s\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:46:31.099133 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.099139 2559 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f-bundle\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:46:31.726683 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.726642 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" event={"ID":"1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f","Type":"ContainerDied","Data":"b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868"} Apr 24 16:46:31.726683 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.726682 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cchbtr" Apr 24 16:46:31.726903 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:31.726685 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b071201e9245620e62514eb7cb67358565e3240f6ebe831300ef4ef3a2d08868" Apr 24 16:46:38.349097 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349049 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"] Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349385 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="extract" Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349398 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="extract" Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349422 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="util" Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349427 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="util" Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349433 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="pull" Apr 24 16:46:38.349481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349438 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="pull" Apr 24 16:46:38.349677 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.349495 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e7a02fd-8f6f-4ca1-81cd-b3626bf5325f" containerName="extract" Apr 24 16:46:38.352371 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.352355 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" Apr 24 16:46:38.360070 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.360047 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-jlkdq\"" Apr 24 16:46:38.360070 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.360070 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 16:46:38.360239 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.360113 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 16:46:38.361281 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.361263 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 16:46:38.402873 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.402841 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"] Apr 24 16:46:38.465970 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.465939 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: \"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" Apr 24 16:46:38.466171 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.465988 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxhm\" (UniqueName: \"kubernetes.io/projected/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-kube-api-access-bnxhm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: \"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" Apr 24 16:46:38.567152 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.567107 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: \"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" Apr 24 16:46:38.567340 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.567189 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxhm\" (UniqueName: \"kubernetes.io/projected/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-kube-api-access-bnxhm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: \"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" Apr 24 16:46:38.569469 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.569451 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: 
\"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"
Apr 24 16:46:38.626333 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.626260 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxhm\" (UniqueName: \"kubernetes.io/projected/6b9e4a68-896e-4ee5-b118-9e8bf3287e5d-kube-api-access-bnxhm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-72pwt\" (UID: \"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"
Apr 24 16:46:38.662173 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.662120 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"
Apr 24 16:46:38.813846 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:38.813814 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"]
Apr 24 16:46:38.817779 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:46:38.817751 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9e4a68_896e_4ee5_b118_9e8bf3287e5d.slice/crio-52ec61635f915bf5f0ef13702afe494a3cfad9697c79cff48b8c0fc260365017 WatchSource:0}: Error finding container 52ec61635f915bf5f0ef13702afe494a3cfad9697c79cff48b8c0fc260365017: Status 404 returned error can't find the container with id 52ec61635f915bf5f0ef13702afe494a3cfad9697c79cff48b8c0fc260365017
Apr 24 16:46:39.752907 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:39.752860 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" event={"ID":"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d","Type":"ContainerStarted","Data":"52ec61635f915bf5f0ef13702afe494a3cfad9697c79cff48b8c0fc260365017"}
Apr 24 16:46:42.742713 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.742673 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nfn57"]
Apr 24 16:46:42.745980 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.745959 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:42.749729 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.749707 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 16:46:42.749930 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.749914 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 16:46:42.753326 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.753305 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-l558z\""
Apr 24 16:46:42.763134 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.763064 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" event={"ID":"6b9e4a68-896e-4ee5-b118-9e8bf3287e5d","Type":"ContainerStarted","Data":"1aea1372b9e684429158e1007f43d1666b23a700bfebd9556df0ddf09128b615"}
Apr 24 16:46:42.763264 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.763246 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"
Apr 24 16:46:42.764923 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.764901 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nfn57"]
Apr 24 16:46:42.889683 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.889628 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt" podStartSLOduration=1.5581225600000002 podStartE2EDuration="4.889603791s" podCreationTimestamp="2026-04-24 16:46:38 +0000 UTC" firstStartedPulling="2026-04-24 16:46:38.819490072 +0000 UTC m=+469.034214427" lastFinishedPulling="2026-04-24 16:46:42.150971301 +0000 UTC m=+472.365695658" observedRunningTime="2026-04-24 16:46:42.888871757 +0000 UTC m=+473.103596135" watchObservedRunningTime="2026-04-24 16:46:42.889603791 +0000 UTC m=+473.104328166"
Apr 24 16:46:42.906198 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.906162 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/69e6355a-7566-4031-a392-e85ba06193ec-cabundle0\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:42.906355 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.906229 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:42.906355 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:42.906288 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hg4\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-kube-api-access-k2hg4\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.007202 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.007099 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/69e6355a-7566-4031-a392-e85ba06193ec-cabundle0\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.007351 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.007211 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.007351 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.007238 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hg4\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-kube-api-access-k2hg4\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.007416 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.007348 2559 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:46:43.007416 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.007367 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:46:43.007416 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.007377 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nfn57: references non-existent secret key: ca.crt
Apr 24 16:46:43.007504 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.007435 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates podName:69e6355a-7566-4031-a392-e85ba06193ec nodeName:}" failed. No retries permitted until 2026-04-24 16:46:43.507419477 +0000 UTC m=+473.722143830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates") pod "keda-operator-ffbb595cb-nfn57" (UID: "69e6355a-7566-4031-a392-e85ba06193ec") : references non-existent secret key: ca.crt
Apr 24 16:46:43.007689 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.007672 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/69e6355a-7566-4031-a392-e85ba06193ec-cabundle0\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.027386 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.027355 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hg4\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-kube-api-access-k2hg4\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.173782 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.173749 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"]
Apr 24 16:46:43.177010 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.176991 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.180744 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.180718 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 16:46:43.193810 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.193783 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"]
Apr 24 16:46:43.309406 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.309303 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44754\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-kube-api-access-44754\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.309406 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.309378 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.309639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.309429 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/880fd61d-d9c9-49f9-8533-d7bc8756d271-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.381471 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.381429 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-tbbp8"]
Apr 24 16:46:43.384624 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.384594 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.387753 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.387729 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 16:46:43.401173 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.401149 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tbbp8"]
Apr 24 16:46:43.410028 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.410005 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/880fd61d-d9c9-49f9-8533-d7bc8756d271-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.410185 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.410126 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44754\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-kube-api-access-44754\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.410185 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.410172 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.410315 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.410291 2559 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:46:43.410315 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.410312 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:46:43.410422 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.410331 2559 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 16:46:43.410422 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.410353 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 16:46:43.410422 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.410416 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates podName:880fd61d-d9c9-49f9-8533-d7bc8756d271 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:43.910396695 +0000 UTC m=+474.125121069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates") pod "keda-metrics-apiserver-7c9f485588-vj29p" (UID: "880fd61d-d9c9-49f9-8533-d7bc8756d271") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 16:46:43.410888 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.410866 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/880fd61d-d9c9-49f9-8533-d7bc8756d271-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.425268 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.425240 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44754\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-kube-api-access-44754\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.511146 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.511111 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszxp\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-kube-api-access-zszxp\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.511292 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.511167 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-certificates\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.511292 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.511269 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:43.511424 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.511409 2559 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:46:43.511461 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.511426 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:46:43.511461 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.511435 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nfn57: references non-existent secret key: ca.crt
Apr 24 16:46:43.511545 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.511482 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates podName:69e6355a-7566-4031-a392-e85ba06193ec nodeName:}" failed. No retries permitted until 2026-04-24 16:46:44.511465015 +0000 UTC m=+474.726189369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates") pod "keda-operator-ffbb595cb-nfn57" (UID: "69e6355a-7566-4031-a392-e85ba06193ec") : references non-existent secret key: ca.crt
Apr 24 16:46:43.612793 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.612706 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zszxp\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-kube-api-access-zszxp\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.612793 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.612759 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-certificates\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.615390 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.615366 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-certificates\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.625251 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.625228 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszxp\" (UniqueName: \"kubernetes.io/projected/9cf00010-8429-4004-8215-9dc539f709a5-kube-api-access-zszxp\") pod \"keda-admission-cf49989db-tbbp8\" (UID: \"9cf00010-8429-4004-8215-9dc539f709a5\") " pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.695453 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.695414 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:43.858404 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.857532 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tbbp8"]
Apr 24 16:46:43.860212 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:46:43.860181 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf00010_8429_4004_8215_9dc539f709a5.slice/crio-c59c5da4ec96cbb3e17ea527a6e9187eb8ea80b093ad588dc0f8056919378ba7 WatchSource:0}: Error finding container c59c5da4ec96cbb3e17ea527a6e9187eb8ea80b093ad588dc0f8056919378ba7: Status 404 returned error can't find the container with id c59c5da4ec96cbb3e17ea527a6e9187eb8ea80b093ad588dc0f8056919378ba7
Apr 24 16:46:43.916338 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:43.916300 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:43.916508 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.916419 2559 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:46:43.916508 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.916433 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:46:43.916508 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.916449 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p: references non-existent secret key: tls.crt
Apr 24 16:46:43.916508 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:43.916507 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates podName:880fd61d-d9c9-49f9-8533-d7bc8756d271 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:44.916492383 +0000 UTC m=+475.131216737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates") pod "keda-metrics-apiserver-7c9f485588-vj29p" (UID: "880fd61d-d9c9-49f9-8533-d7bc8756d271") : references non-existent secret key: tls.crt
Apr 24 16:46:44.522483 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:44.522439 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:44.522650 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.522633 2559 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:46:44.522738 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.522655 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:46:44.522738 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.522669 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nfn57: references non-existent secret key: ca.crt
Apr 24 16:46:44.522738 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.522735 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates podName:69e6355a-7566-4031-a392-e85ba06193ec nodeName:}" failed. No retries permitted until 2026-04-24 16:46:46.522714266 +0000 UTC m=+476.737438632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates") pod "keda-operator-ffbb595cb-nfn57" (UID: "69e6355a-7566-4031-a392-e85ba06193ec") : references non-existent secret key: ca.crt
Apr 24 16:46:44.771819 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:44.771781 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tbbp8" event={"ID":"9cf00010-8429-4004-8215-9dc539f709a5","Type":"ContainerStarted","Data":"c59c5da4ec96cbb3e17ea527a6e9187eb8ea80b093ad588dc0f8056919378ba7"}
Apr 24 16:46:44.926571 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:44.926544 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:44.926873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.926679 2559 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:46:44.926873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.926692 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:46:44.926873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.926713 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p: references non-existent secret key: tls.crt
Apr 24 16:46:44.926873 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:44.926768 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates podName:880fd61d-d9c9-49f9-8533-d7bc8756d271 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:46.926754726 +0000 UTC m=+477.141479080 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates") pod "keda-metrics-apiserver-7c9f485588-vj29p" (UID: "880fd61d-d9c9-49f9-8533-d7bc8756d271") : references non-existent secret key: tls.crt
Apr 24 16:46:45.776418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:45.776383 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tbbp8" event={"ID":"9cf00010-8429-4004-8215-9dc539f709a5","Type":"ContainerStarted","Data":"0b824e4f12a7ecdf8a6b0c3c9fd7fa6406f24612690e002b0b3b35816316eb99"}
Apr 24 16:46:45.776611 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:45.776555 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:46:45.793467 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:45.793419 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-tbbp8" podStartSLOduration=1.7362108649999999 podStartE2EDuration="2.79340443s" podCreationTimestamp="2026-04-24 16:46:43 +0000 UTC" firstStartedPulling="2026-04-24 16:46:43.861520452 +0000 UTC m=+474.076244814" lastFinishedPulling="2026-04-24 16:46:44.918714025 +0000 UTC m=+475.133438379" observedRunningTime="2026-04-24 16:46:45.792974137 +0000 UTC m=+476.007698513" watchObservedRunningTime="2026-04-24 16:46:45.79340443 +0000 UTC m=+476.008128808"
Apr 24 16:46:46.537831 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:46.537792 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:46.538253 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.537929 2559 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:46:46.538253 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.537948 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:46:46.538253 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.537957 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nfn57: references non-existent secret key: ca.crt
Apr 24 16:46:46.538253 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.538008 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates podName:69e6355a-7566-4031-a392-e85ba06193ec nodeName:}" failed. No retries permitted until 2026-04-24 16:46:50.537994053 +0000 UTC m=+480.752718407 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates") pod "keda-operator-ffbb595cb-nfn57" (UID: "69e6355a-7566-4031-a392-e85ba06193ec") : references non-existent secret key: ca.crt
Apr 24 16:46:46.940774 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:46.940746 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:46.940926 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.940860 2559 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:46:46.940926 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.940872 2559 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:46:46.940926 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.940890 2559 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p: references non-existent secret key: tls.crt
Apr 24 16:46:46.941050 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:46:46.940943 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates podName:880fd61d-d9c9-49f9-8533-d7bc8756d271 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:50.940927482 +0000 UTC m=+481.155651841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates") pod "keda-metrics-apiserver-7c9f485588-vj29p" (UID: "880fd61d-d9c9-49f9-8533-d7bc8756d271") : references non-existent secret key: tls.crt
Apr 24 16:46:50.569522 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.569479 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:50.573143 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.573113 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69e6355a-7566-4031-a392-e85ba06193ec-certificates\") pod \"keda-operator-ffbb595cb-nfn57\" (UID: \"69e6355a-7566-4031-a392-e85ba06193ec\") " pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:50.855982 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.855887 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:50.974479 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.974443 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:50.977065 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.977039 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/880fd61d-d9c9-49f9-8533-d7bc8756d271-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vj29p\" (UID: \"880fd61d-d9c9-49f9-8533-d7bc8756d271\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:50.980923 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.980898 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nfn57"]
Apr 24 16:46:50.983688 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:46:50.983652 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e6355a_7566_4031_a392_e85ba06193ec.slice/crio-b47fd44db9d84d8c795e95ba99e9129612aacb87a304c1d2eb127dc1475df083 WatchSource:0}: Error finding container b47fd44db9d84d8c795e95ba99e9129612aacb87a304c1d2eb127dc1475df083: Status 404 returned error can't find the container with id b47fd44db9d84d8c795e95ba99e9129612aacb87a304c1d2eb127dc1475df083
Apr 24 16:46:50.987398 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:50.987379 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:51.111374 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:51.111344 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"]
Apr 24 16:46:51.113386 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:46:51.113359 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod880fd61d_d9c9_49f9_8533_d7bc8756d271.slice/crio-a6b89ec41bd8c33b9db22c2d474c0632b5537218d4418352892ba040143d77ae WatchSource:0}: Error finding container a6b89ec41bd8c33b9db22c2d474c0632b5537218d4418352892ba040143d77ae: Status 404 returned error can't find the container with id a6b89ec41bd8c33b9db22c2d474c0632b5537218d4418352892ba040143d77ae
Apr 24 16:46:51.797803 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:51.797766 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nfn57" event={"ID":"69e6355a-7566-4031-a392-e85ba06193ec","Type":"ContainerStarted","Data":"b47fd44db9d84d8c795e95ba99e9129612aacb87a304c1d2eb127dc1475df083"}
Apr 24 16:46:51.799268 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:51.799227 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p" event={"ID":"880fd61d-d9c9-49f9-8533-d7bc8756d271","Type":"ContainerStarted","Data":"a6b89ec41bd8c33b9db22c2d474c0632b5537218d4418352892ba040143d77ae"}
Apr 24 16:46:54.813106 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.812971 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nfn57" event={"ID":"69e6355a-7566-4031-a392-e85ba06193ec","Type":"ContainerStarted","Data":"96f022e735ce3501723342afc92ec99cadc64173a6cef626eb2e2678761e0875"}
Apr 24 16:46:54.813106 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.813022 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:46:54.814284 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.814259 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p" event={"ID":"880fd61d-d9c9-49f9-8533-d7bc8756d271","Type":"ContainerStarted","Data":"d64e5c0d661d77edf1a768857fa1d081273b14f872480b56fc36b66251ce5ff6"}
Apr 24 16:46:54.814418 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.814404 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:46:54.883153 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.883070 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p" podStartSLOduration=8.576152275 podStartE2EDuration="11.8830547s" podCreationTimestamp="2026-04-24 16:46:43 +0000 UTC" firstStartedPulling="2026-04-24 16:46:51.11470842 +0000 UTC m=+481.329432777" lastFinishedPulling="2026-04-24 16:46:54.421610848 +0000 UTC m=+484.636335202" observedRunningTime="2026-04-24 16:46:54.882921273 +0000 UTC m=+485.097645649" watchObservedRunningTime="2026-04-24 16:46:54.8830547 +0000 UTC m=+485.097779075"
Apr 24 16:46:54.883827 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:46:54.883792 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-nfn57" podStartSLOduration=9.442755448 podStartE2EDuration="12.883784012s" podCreationTimestamp="2026-04-24 16:46:42 +0000 UTC" firstStartedPulling="2026-04-24 16:46:50.984969639 +0000 UTC m=+481.199693996" lastFinishedPulling="2026-04-24 16:46:54.425998191 +0000 UTC m=+484.640722560" observedRunningTime="2026-04-24 16:46:54.846454267 +0000 UTC m=+485.061178643" watchObservedRunningTime="2026-04-24 16:46:54.883784012 +0000 UTC m=+485.098508387"
Apr 24 16:47:03.770669 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:03.770640 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-72pwt"
Apr 24 16:47:05.822144 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:05.822113 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vj29p"
Apr 24 16:47:06.781895 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:06.781865 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-tbbp8"
Apr 24 16:47:15.819619 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:15.819591 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-nfn57"
Apr 24 16:47:50.445661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.445581 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"]
Apr 24 16:47:50.454461 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.454435 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:47:50.458597 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.458561 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:47:50.458740 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.458686 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hn86\" (UniqueName: \"kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:47:50.458821 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.458748 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 24 16:47:50.458875 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.458827 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-k5nnt\""
Apr 24 16:47:50.459180 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.459158 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 16:47:50.459180 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.459175 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 16:47:50.463195 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.463175 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xds52"]
Apr 24
16:47:50.466481 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.466461 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"] Apr 24 16:47:50.466568 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.466550 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.469046 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.469026 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 16:47:50.469163 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.469052 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7xl4s\"" Apr 24 16:47:50.478120 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.478068 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xds52"] Apr 24 16:47:50.559678 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.559651 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hn86\" (UniqueName: \"kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" Apr 24 16:47:50.559865 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.559696 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" Apr 24 16:47:50.559865 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.559715 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1737964a-86e6-47db-810d-8453f3678972-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: \"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.559865 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.559758 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppgc\" (UniqueName: \"kubernetes.io/projected/1737964a-86e6-47db-810d-8453f3678972-kube-api-access-4ppgc\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: \"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.562171 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.562148 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" Apr 24 16:47:50.575556 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.575525 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hn86\" (UniqueName: \"kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86\") pod \"kserve-controller-manager-7f7fb4c66f-v2dwj\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" Apr 24 16:47:50.660633 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.660603 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppgc\" (UniqueName: \"kubernetes.io/projected/1737964a-86e6-47db-810d-8453f3678972-kube-api-access-4ppgc\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: 
\"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.660794 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.660687 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1737964a-86e6-47db-810d-8453f3678972-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: \"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.663072 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.663046 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1737964a-86e6-47db-810d-8453f3678972-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: \"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.683608 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.683585 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppgc\" (UniqueName: \"kubernetes.io/projected/1737964a-86e6-47db-810d-8453f3678972-kube-api-access-4ppgc\") pod \"llmisvc-controller-manager-68cc5db7c4-xds52\" (UID: \"1737964a-86e6-47db-810d-8453f3678972\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.765813 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.765725 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" Apr 24 16:47:50.778576 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.778553 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" Apr 24 16:47:50.901639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.901571 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"] Apr 24 16:47:50.903971 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:47:50.903926 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a21f77e_c24e_4ca5_8fa3_bc8bb67c476b.slice/crio-a3cddb2c6f9997b77e53471c55e75640029a6335c7b30ba444951e6df6d87e0b WatchSource:0}: Error finding container a3cddb2c6f9997b77e53471c55e75640029a6335c7b30ba444951e6df6d87e0b: Status 404 returned error can't find the container with id a3cddb2c6f9997b77e53471c55e75640029a6335c7b30ba444951e6df6d87e0b Apr 24 16:47:50.924607 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:50.924485 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xds52"] Apr 24 16:47:50.926942 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:47:50.926905 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1737964a_86e6_47db_810d_8453f3678972.slice/crio-a1b42abff6b9f98e8ad315703c8d224ae2329b2ed3873cbff0d310a57bdb9381 WatchSource:0}: Error finding container a1b42abff6b9f98e8ad315703c8d224ae2329b2ed3873cbff0d310a57bdb9381: Status 404 returned error can't find the container with id a1b42abff6b9f98e8ad315703c8d224ae2329b2ed3873cbff0d310a57bdb9381 Apr 24 16:47:51.002804 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:51.002767 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" event={"ID":"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b","Type":"ContainerStarted","Data":"a3cddb2c6f9997b77e53471c55e75640029a6335c7b30ba444951e6df6d87e0b"} Apr 24 16:47:51.003809 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:51.003786 2559 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" event={"ID":"1737964a-86e6-47db-810d-8453f3678972","Type":"ContainerStarted","Data":"a1b42abff6b9f98e8ad315703c8d224ae2329b2ed3873cbff0d310a57bdb9381"}
Apr 24 16:47:55.020794 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.020757 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" event={"ID":"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b","Type":"ContainerStarted","Data":"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"}
Apr 24 16:47:55.021262 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.020973 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:47:55.022117 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.022071 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" event={"ID":"1737964a-86e6-47db-810d-8453f3678972","Type":"ContainerStarted","Data":"c0b253f3d561add2ec1f3348a65999ae7c42648541255cca85ed66f0881b88e0"}
Apr 24 16:47:55.022216 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.022190 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52"
Apr 24 16:47:55.040874 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.040830 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" podStartSLOduration=1.7731026079999999 podStartE2EDuration="5.040818629s" podCreationTimestamp="2026-04-24 16:47:50 +0000 UTC" firstStartedPulling="2026-04-24 16:47:50.905421042 +0000 UTC m=+541.120145395" lastFinishedPulling="2026-04-24 16:47:54.17313706 +0000 UTC m=+544.387861416" observedRunningTime="2026-04-24 16:47:55.039443885 +0000 UTC m=+545.254168261" watchObservedRunningTime="2026-04-24 16:47:55.040818629 +0000 UTC m=+545.255543005"
Apr 24 16:47:55.057122 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:47:55.057059 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52" podStartSLOduration=1.8632876870000001 podStartE2EDuration="5.057047095s" podCreationTimestamp="2026-04-24 16:47:50 +0000 UTC" firstStartedPulling="2026-04-24 16:47:50.928263118 +0000 UTC m=+541.142987474" lastFinishedPulling="2026-04-24 16:47:54.122022528 +0000 UTC m=+544.336746882" observedRunningTime="2026-04-24 16:47:55.056672949 +0000 UTC m=+545.271397325" watchObservedRunningTime="2026-04-24 16:47:55.057047095 +0000 UTC m=+545.271771470"
Apr 24 16:48:26.028150 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:26.028110 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xds52"
Apr 24 16:48:26.031451 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:26.031431 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:48:27.408288 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.408243 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"]
Apr 24 16:48:27.408777 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.408516 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" podUID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" containerName="manager" containerID="cri-o://8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052" gracePeriod=10
Apr 24 16:48:27.432817 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.432784 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-xkb86"]
Apr 24 16:48:27.498525 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.498494 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-xkb86"]
Apr 24 16:48:27.498695 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.498672 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.584521 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.584472 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsnb\" (UniqueName: \"kubernetes.io/projected/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-kube-api-access-8rsnb\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.584719 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.584568 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-cert\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.680405 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.680378 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:48:27.685391 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.685357 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-cert\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.685639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.685439 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rsnb\" (UniqueName: \"kubernetes.io/projected/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-kube-api-access-8rsnb\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.687887 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.687862 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-cert\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.693774 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.693749 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsnb\" (UniqueName: \"kubernetes.io/projected/be6b0951-d5a0-4c6a-9dea-a034b83b63fa-kube-api-access-8rsnb\") pod \"kserve-controller-manager-7f7fb4c66f-xkb86\" (UID: \"be6b0951-d5a0-4c6a-9dea-a034b83b63fa\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.786005 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.785971 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert\") pod \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") "
Apr 24 16:48:27.786005 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.786012 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hn86\" (UniqueName: \"kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86\") pod \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\" (UID: \"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b\") "
Apr 24 16:48:27.788384 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.788343 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert" (OuterVolumeSpecName: "cert") pod "1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" (UID: "1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:48:27.788384 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.788343 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86" (OuterVolumeSpecName: "kube-api-access-6hn86") pod "1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" (UID: "1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b"). InnerVolumeSpecName "kube-api-access-6hn86". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:48:27.853465 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.853425 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:27.887525 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.887478 2559 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:48:27.887525 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.887510 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hn86\" (UniqueName: \"kubernetes.io/projected/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b-kube-api-access-6hn86\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\""
Apr 24 16:48:27.980548 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:27.980509 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-xkb86"]
Apr 24 16:48:27.983840 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:48:27.983812 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe6b0951_d5a0_4c6a_9dea_a034b83b63fa.slice/crio-88590c55bba03f6a99910786242109593c56e794d000dc09a88ebc09ea82cbec WatchSource:0}: Error finding container 88590c55bba03f6a99910786242109593c56e794d000dc09a88ebc09ea82cbec: Status 404 returned error can't find the container with id 88590c55bba03f6a99910786242109593c56e794d000dc09a88ebc09ea82cbec
Apr 24 16:48:28.129092 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.129037 2559 generic.go:358] "Generic (PLEG): container finished" podID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" containerID="8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052" exitCode=0
Apr 24 16:48:28.129271 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.129094 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" event={"ID":"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b","Type":"ContainerDied","Data":"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"}
Apr 24 16:48:28.129271 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.129128 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"
Apr 24 16:48:28.129271 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.129141 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-v2dwj" event={"ID":"1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b","Type":"ContainerDied","Data":"a3cddb2c6f9997b77e53471c55e75640029a6335c7b30ba444951e6df6d87e0b"}
Apr 24 16:48:28.129271 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.129159 2559 scope.go:117] "RemoveContainer" containerID="8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"
Apr 24 16:48:28.130364 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.130333 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86" event={"ID":"be6b0951-d5a0-4c6a-9dea-a034b83b63fa","Type":"ContainerStarted","Data":"88590c55bba03f6a99910786242109593c56e794d000dc09a88ebc09ea82cbec"}
Apr 24 16:48:28.137864 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.137847 2559 scope.go:117] "RemoveContainer" containerID="8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"
Apr 24 16:48:28.138182 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:48:28.138151 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052\": container with ID starting with 8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052 not found: ID does not exist" containerID="8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"
Apr 24 16:48:28.138280 ip-10-0-137-179 kubenswrapper[2559]:
I0424 16:48:28.138183 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052"} err="failed to get container status \"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052\": rpc error: code = NotFound desc = could not find container \"8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052\": container with ID starting with 8e146aa81ccfffc953a009968079f1dc1d4d70a64385fb308f2096ece8600052 not found: ID does not exist"
Apr 24 16:48:28.152165 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.152130 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"]
Apr 24 16:48:28.154802 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.154777 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-v2dwj"]
Apr 24 16:48:28.408804 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:28.408722 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" path="/var/lib/kubelet/pods/1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b/volumes"
Apr 24 16:48:29.135359 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:29.135323 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86" event={"ID":"be6b0951-d5a0-4c6a-9dea-a034b83b63fa","Type":"ContainerStarted","Data":"8525cb60f14e3a05b1bc8bf840814a2f9abf8762daa407dcbe29d6891202ddd3"}
Apr 24 16:48:29.135538 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:29.135432 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:48:29.153063 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:29.153009 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86" podStartSLOduration=1.477549416 podStartE2EDuration="2.152993914s" podCreationTimestamp="2026-04-24 16:48:27 +0000 UTC" firstStartedPulling="2026-04-24 16:48:27.985099797 +0000 UTC m=+578.199824154" lastFinishedPulling="2026-04-24 16:48:28.660544284 +0000 UTC m=+578.875268652" observedRunningTime="2026-04-24 16:48:29.150831826 +0000 UTC m=+579.365556202" watchObservedRunningTime="2026-04-24 16:48:29.152993914 +0000 UTC m=+579.367718290"
Apr 24 16:48:35.845134 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:35.845071 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6578576886-sk8bz"]
Apr 24 16:48:50.286533 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:50.286501 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 16:48:50.287444 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:50.287425 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 16:48:50.289033 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:50.289010 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:48:50.289858 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:48:50.289842 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 16:49:00.143661 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:00.143630 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-xkb86"
Apr 24 16:49:00.866628 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:00.866571 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6578576886-sk8bz" podUID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" containerName="console" containerID="cri-o://1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7" gracePeriod=15
Apr 24 16:49:01.110800 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.110776 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6578576886-sk8bz_8ede5be1-ef30-43e1-9977-4c4299dbb2f9/console/0.log"
Apr 24 16:49:01.110920 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.110853 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6578576886-sk8bz"
Apr 24 16:49:01.155447 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155361 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155465 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155498 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155535 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vq8\" (UniqueName: \"kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155626 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155651 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.155878 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.155696 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert\") pod \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\" (UID: \"8ede5be1-ef30-43e1-9977-4c4299dbb2f9\") "
Apr 24 16:49:01.156397 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.156367 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca" (OuterVolumeSpecName: "service-ca") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:49:01.156512 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.156409 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:49:01.156743 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.156722 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config" (OuterVolumeSpecName: "console-config") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:49:01.156998 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.156975 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:49:01.158254 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.158217 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:49:01.158754 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.158717 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8" (OuterVolumeSpecName: "kube-api-access-78vq8") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "kube-api-access-78vq8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:49:01.159438 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.159336 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8ede5be1-ef30-43e1-9977-4c4299dbb2f9" (UID: "8ede5be1-ef30-43e1-9977-4c4299dbb2f9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:49:01.238472 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238445 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6578576886-sk8bz_8ede5be1-ef30-43e1-9977-4c4299dbb2f9/console/0.log"
Apr 24 16:49:01.238639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238482 2559 generic.go:358] "Generic (PLEG): container finished" podID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" containerID="1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7" exitCode=2
Apr 24 16:49:01.238639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238508 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6578576886-sk8bz" event={"ID":"8ede5be1-ef30-43e1-9977-4c4299dbb2f9","Type":"ContainerDied","Data":"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"}
Apr 24 16:49:01.238639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238529 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6578576886-sk8bz" event={"ID":"8ede5be1-ef30-43e1-9977-4c4299dbb2f9","Type":"ContainerDied","Data":"3b3e7e5e65fd055285066115621eede0a0240af57ac6a829a66d803d86a5996e"}
Apr 24 16:49:01.238639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238543 2559 scope.go:117] "RemoveContainer" containerID="1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"
Apr 24 16:49:01.238639 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.238560 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6578576886-sk8bz"
Apr 24 16:49:01.246565 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.246553 2559 scope.go:117] "RemoveContainer" containerID="1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"
Apr 24 16:49:01.246820 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:49:01.246803 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7\": container with ID starting with 1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7 not found: ID does not exist" containerID="1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"
Apr 24 16:49:01.246874 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.246828 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7"} err="failed to get container status \"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7\": rpc error: code = NotFound desc = could not find container \"1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7\": container with ID starting with 1851e813c137908b66c5468d140720cacd0a24031beb60cc97fd171066f32bb7 not found: ID does not exist"
Apr 24 16:49:01.256228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256209 2559
reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-trusted-ca-bundle\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256228 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256229 2559 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256239 2559 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-oauth-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256249 2559 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-oauth-config\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256258 2559 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-service-ca\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256267 2559 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-console-serving-cert\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.256347 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.256275 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78vq8\" (UniqueName: 
\"kubernetes.io/projected/8ede5be1-ef30-43e1-9977-4c4299dbb2f9-kube-api-access-78vq8\") on node \"ip-10-0-137-179.ec2.internal\" DevicePath \"\"" Apr 24 16:49:01.272887 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.272859 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6578576886-sk8bz"] Apr 24 16:49:01.279447 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:01.279427 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6578576886-sk8bz"] Apr 24 16:49:02.408954 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:49:02.408924 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" path="/var/lib/kubelet/pods/8ede5be1-ef30-43e1-9977-4c4299dbb2f9/volumes" Apr 24 16:50:10.735512 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735477 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x"] Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735773 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" containerName="console" Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735784 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" containerName="console" Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735796 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" containerName="manager" Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735801 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" containerName="manager" Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735855 2559 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1a21f77e-c24e-4ca5-8fa3-bc8bb67c476b" containerName="manager" Apr 24 16:50:10.735915 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.735865 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ede5be1-ef30-43e1-9977-4c4299dbb2f9" containerName="console" Apr 24 16:50:10.738588 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.738570 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.748112 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.748067 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 16:50:10.748234 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.748069 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 16:50:10.748234 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.748197 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z9xpl\"" Apr 24 16:50:10.759152 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.759127 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x"] Apr 24 16:50:10.807426 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.807395 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-data\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.807577 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.807440 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ds6g\" (UniqueName: 
\"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-kube-api-access-5ds6g\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.807577 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.807531 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.908100 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.908045 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.908280 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.908185 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-data\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.908280 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:50:10.908196 2559 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 24 16:50:10.908280 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:50:10.908214 2559 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x: secret "seaweedfs-tls-serving" not found 
Apr 24 16:50:10.908280 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.908226 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ds6g\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-kube-api-access-5ds6g\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.908280 ip-10-0-137-179 kubenswrapper[2559]: E0424 16:50:10.908269 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving podName:777baad7-b9e5-48d2-82d9-9e41c4a18f6a nodeName:}" failed. No retries permitted until 2026-04-24 16:50:11.408247514 +0000 UTC m=+681.622971884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-8pp9x" (UID: "777baad7-b9e5-48d2-82d9-9e41c4a18f6a") : secret "seaweedfs-tls-serving" not found Apr 24 16:50:10.908649 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.908631 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-data\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:10.930927 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:10.930898 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ds6g\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-kube-api-access-5ds6g\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:11.412169 
ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:11.412131 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:11.414470 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:11.414449 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/777baad7-b9e5-48d2-82d9-9e41c4a18f6a-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-8pp9x\" (UID: \"777baad7-b9e5-48d2-82d9-9e41c4a18f6a\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:11.647730 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:11.647695 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" Apr 24 16:50:11.770987 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:11.770953 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x"] Apr 24 16:50:11.774441 ip-10-0-137-179 kubenswrapper[2559]: W0424 16:50:11.774414 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod777baad7_b9e5_48d2_82d9_9e41c4a18f6a.slice/crio-dd92483acb07ecb48f9f95d531629cd103e577da4152565c73c554e5fc06132c WatchSource:0}: Error finding container dd92483acb07ecb48f9f95d531629cd103e577da4152565c73c554e5fc06132c: Status 404 returned error can't find the container with id dd92483acb07ecb48f9f95d531629cd103e577da4152565c73c554e5fc06132c Apr 24 16:50:11.775782 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:11.775765 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 
16:50:12.466929 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:12.466884 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" event={"ID":"777baad7-b9e5-48d2-82d9-9e41c4a18f6a","Type":"ContainerStarted","Data":"dd92483acb07ecb48f9f95d531629cd103e577da4152565c73c554e5fc06132c"} Apr 24 16:50:14.478633 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:14.478602 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" event={"ID":"777baad7-b9e5-48d2-82d9-9e41c4a18f6a","Type":"ContainerStarted","Data":"a7db7d895488c77676216acded9bcbe1023cf673e9e8d42c8d38721141ae4592"} Apr 24 16:50:14.496383 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:50:14.496296 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-8pp9x" podStartSLOduration=2.026703359 podStartE2EDuration="4.496281301s" podCreationTimestamp="2026-04-24 16:50:10 +0000 UTC" firstStartedPulling="2026-04-24 16:50:11.775878647 +0000 UTC m=+681.990603000" lastFinishedPulling="2026-04-24 16:50:14.245456586 +0000 UTC m=+684.460180942" observedRunningTime="2026-04-24 16:50:14.494788218 +0000 UTC m=+684.709512593" watchObservedRunningTime="2026-04-24 16:50:14.496281301 +0000 UTC m=+684.711005673" Apr 24 16:53:50.309840 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:53:50.309763 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:53:50.312061 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:53:50.312036 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:53:50.312750 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:53:50.312729 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 16:53:50.314716 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:53:50.314695 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 16:58:50.332289 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:58:50.332263 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:58:50.334894 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:58:50.334872 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 16:58:50.335528 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:58:50.335506 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 16:58:50.338045 ip-10-0-137-179 kubenswrapper[2559]: I0424 16:58:50.338026 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:03:50.352875 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:03:50.352845 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:03:50.355358 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:03:50.355333 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:03:50.355938 ip-10-0-137-179 kubenswrapper[2559]: I0424 
17:03:50.355918 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:03:50.358313 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:03:50.358295 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:08:50.373510 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:08:50.373477 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:08:50.375710 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:08:50.375686 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:08:50.376313 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:08:50.376292 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:08:50.378193 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:08:50.378176 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:13:50.396238 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:13:50.396204 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:13:50.398841 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:13:50.398814 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:13:50.399001 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:13:50.398981 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:13:50.401646 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:13:50.401625 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:18:50.423112 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:18:50.423066 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:18:50.423973 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:18:50.423952 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:18:50.425799 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:18:50.425778 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:18:50.426718 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:18:50.426701 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:23:50.442839 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:23:50.442761 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:23:50.445225 ip-10-0-137-179 
kubenswrapper[2559]: I0424 17:23:50.445203 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:23:50.445409 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:23:50.445394 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:23:50.447637 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:23:50.447612 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:28:50.462746 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:28:50.462717 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:28:50.465200 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:28:50.465177 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:28:50.466221 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:28:50.466198 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:28:50.468637 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:28:50.468619 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:33:50.484786 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:33:50.484754 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:33:50.487299 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:33:50.487274 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:33:50.489503 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:33:50.489482 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:33:50.491961 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:33:50.491937 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:38:50.504206 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:38:50.504173 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:38:50.506999 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:38:50.506951 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:38:50.509498 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:38:50.509478 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:38:50.512227 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:38:50.512211 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:43:50.524706 ip-10-0-137-179 
kubenswrapper[2559]: I0424 17:43:50.524591 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:43:50.528862 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:43:50.527406 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:43:50.530431 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:43:50.530411 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log" Apr 24 17:43:50.538712 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:43:50.538687 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log" Apr 24 17:46:32.503053 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:32.502976 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dlfcn_e79110fd-148d-4488-a9fe-340f434292f4/global-pull-secret-syncer/0.log" Apr 24 17:46:32.561782 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:32.561754 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-86w64_ac5de78b-4789-4f34-988c-b131198ef66b/konnectivity-agent/0.log" Apr 24 17:46:32.693914 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:32.693874 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-179.ec2.internal_5aeb36898807b4f178264985b1daa831/haproxy/0.log" Apr 24 17:46:36.639643 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:36.639609 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-7jj5d_bf3d26f4-bc53-404d-bbcf-94226902e26e/node-exporter/0.log"
Apr 24 17:46:36.668952 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:36.668927 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jj5d_bf3d26f4-bc53-404d-bbcf-94226902e26e/kube-rbac-proxy/0.log"
Apr 24 17:46:36.696743 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:36.696722 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jj5d_bf3d26f4-bc53-404d-bbcf-94226902e26e/init-textfile/0.log"
Apr 24 17:46:37.150632 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.150606 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-sf9xh_60317e3c-b457-4ed8-bb09-3da52605d686/prometheus-operator/0.log"
Apr 24 17:46:37.175851 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.175813 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-sf9xh_60317e3c-b457-4ed8-bb09-3da52605d686/kube-rbac-proxy/0.log"
Apr 24 17:46:37.207162 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.207134 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hc22j_3d26cad2-132b-46c7-b8cc-03e2fc8f5ab1/prometheus-operator-admission-webhook/0.log"
Apr 24 17:46:37.254102 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.254053 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-596946bff9-5878t_bffc6a3e-834d-435f-bb13-f88d1ac55ab1/telemeter-client/0.log"
Apr 24 17:46:37.278993 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.278957 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-596946bff9-5878t_bffc6a3e-834d-435f-bb13-f88d1ac55ab1/reload/0.log"
Apr 24 17:46:37.307284 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:37.307252 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-596946bff9-5878t_bffc6a3e-834d-435f-bb13-f88d1ac55ab1/kube-rbac-proxy/0.log"
Apr 24 17:46:38.912154 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:38.912123 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/1.log"
Apr 24 17:46:38.920542 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:38.920514 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2mjt5_6bb1683c-ac7c-4b67-934a-5a1aa7657a5f/console-operator/2.log"
Apr 24 17:46:39.453756 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.453723 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"]
Apr 24 17:46:39.457111 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.457070 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.459403 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.459380 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"openshift-service-ca.crt\""
Apr 24 17:46:39.460421 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.460390 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-twkp6\"/\"default-dockercfg-7txp2\""
Apr 24 17:46:39.460486 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.460418 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"kube-root-ca.crt\""
Apr 24 17:46:39.466095 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.466055 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"]
Apr 24 17:46:39.479932 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.479902 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-proc\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.480057 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.479935 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-lib-modules\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.480057 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.479979 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-sys\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.480057 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.480005 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vgg6\" (UniqueName: \"kubernetes.io/projected/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-kube-api-access-9vgg6\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.480183 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.480075 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-podres\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.580875 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580829 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-proc\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.580875 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580869 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-lib-modules\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580887 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-sys\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580912 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vgg6\" (UniqueName: \"kubernetes.io/projected/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-kube-api-access-9vgg6\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580940 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-podres\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580963 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-proc\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.580983 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-sys\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.581032 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-lib-modules\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.581148 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.581042 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-podres\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.590285 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.590261 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vgg6\" (UniqueName: \"kubernetes.io/projected/788b5ab6-c4ab-4bc6-988d-39cc834a7f16-kube-api-access-9vgg6\") pod \"perf-node-gather-daemonset-zhp7b\" (UID: \"788b5ab6-c4ab-4bc6-988d-39cc834a7f16\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.767751 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.767649 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:39.889491 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.889461 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"]
Apr 24 17:46:39.892198 ip-10-0-137-179 kubenswrapper[2559]: W0424 17:46:39.892167 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod788b5ab6_c4ab_4bc6_988d_39cc834a7f16.slice/crio-73261558e475b873b0d66304639d913f829b96ce2282c9e98b200d2c3b1287ec WatchSource:0}: Error finding container 73261558e475b873b0d66304639d913f829b96ce2282c9e98b200d2c3b1287ec: Status 404 returned error can't find the container with id 73261558e475b873b0d66304639d913f829b96ce2282c9e98b200d2c3b1287ec
Apr 24 17:46:39.893734 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:39.893717 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:46:40.095494 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.095404 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b" event={"ID":"788b5ab6-c4ab-4bc6-988d-39cc834a7f16","Type":"ContainerStarted","Data":"ddbceb51cf1aeafa1f82c1b81d559edb775a29a3c0264cddaa6f1573620eebf6"}
Apr 24 17:46:40.095494 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.095445 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b" event={"ID":"788b5ab6-c4ab-4bc6-988d-39cc834a7f16","Type":"ContainerStarted","Data":"73261558e475b873b0d66304639d913f829b96ce2282c9e98b200d2c3b1287ec"}
Apr 24 17:46:40.095882 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.095551 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:40.112279 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.112233 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b" podStartSLOduration=1.112218002 podStartE2EDuration="1.112218002s" podCreationTimestamp="2026-04-24 17:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:46:40.110676861 +0000 UTC m=+4070.325401238" watchObservedRunningTime="2026-04-24 17:46:40.112218002 +0000 UTC m=+4070.326942378"
Apr 24 17:46:40.327096 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.327044 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-884bg_641dddb3-e48f-4ac9-9fad-88baa8d3cd29/dns/0.log"
Apr 24 17:46:40.348994 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.348915 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-884bg_641dddb3-e48f-4ac9-9fad-88baa8d3cd29/kube-rbac-proxy/0.log"
Apr 24 17:46:40.493235 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:40.493204 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7t5b9_eec44da0-010a-4044-830b-8f9776a91747/dns-node-resolver/0.log"
Apr 24 17:46:41.079659 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:41.079624 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v6j2z_7282f227-12bf-4645-b003-1131e4895ca0/node-ca/0.log"
Apr 24 17:46:41.769412 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:41.769384 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7d5f68b498-j2k2b_cdada5da-f65e-47aa-8f22-abd619fe9d1c/router/0.log"
Apr 24 17:46:42.110589 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:42.110500 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dphxj_c579f8b7-4799-4d6b-8770-b69441d1c6b7/serve-healthcheck-canary/0.log"
Apr 24 17:46:42.564126 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:42.564100 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qmc_4e1e680b-97f4-47ec-a54e-c610228971eb/kube-rbac-proxy/0.log"
Apr 24 17:46:42.584260 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:42.584227 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qmc_4e1e680b-97f4-47ec-a54e-c610228971eb/exporter/0.log"
Apr 24 17:46:42.605869 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:42.605844 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qmc_4e1e680b-97f4-47ec-a54e-c610228971eb/extractor/0.log"
Apr 24 17:46:44.736919 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:44.736884 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7f7fb4c66f-xkb86_be6b0951-d5a0-4c6a-9dea-a034b83b63fa/manager/0.log"
Apr 24 17:46:44.756724 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:44.756689 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-xds52_1737964a-86e6-47db-810d-8453f3678972/manager/0.log"
Apr 24 17:46:45.367224 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:45.367192 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-8pp9x_777baad7-b9e5-48d2-82d9-9e41c4a18f6a/seaweedfs-tls-serving/0.log"
Apr 24 17:46:46.108795 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:46.108769 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-zhp7b"
Apr 24 17:46:50.369820 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.369789 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-47g4c_d5140d31-beb5-42da-bdf5-60b7bfc41f79/kube-multus/0.log"
Apr 24 17:46:50.602324 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.602294 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/kube-multus-additional-cni-plugins/0.log"
Apr 24 17:46:50.623170 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.623141 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/egress-router-binary-copy/0.log"
Apr 24 17:46:50.643921 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.643898 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/cni-plugins/0.log"
Apr 24 17:46:50.665695 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.665668 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/bond-cni-plugin/0.log"
Apr 24 17:46:50.687825 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.687796 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/routeoverride-cni/0.log"
Apr 24 17:46:50.707832 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.707804 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/whereabouts-cni-bincopy/0.log"
Apr 24 17:46:50.732125 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:50.732027 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8gmjx_af7a310a-32d2-48cf-b2d6-69c07daae4b0/whereabouts-cni/0.log"
Apr 24 17:46:51.046985 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.046893 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xlwrd_9a75b238-62f3-4139-9303-b235113baa9d/network-metrics-daemon/0.log"
Apr 24 17:46:51.065925 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.065893 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xlwrd_9a75b238-62f3-4139-9303-b235113baa9d/kube-rbac-proxy/0.log"
Apr 24 17:46:51.755784 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.755748 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-controller/0.log"
Apr 24 17:46:51.771136 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.771111 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/0.log"
Apr 24 17:46:51.810514 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.810482 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovn-acl-logging/1.log"
Apr 24 17:46:51.833529 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.833489 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/kube-rbac-proxy-node/0.log"
Apr 24 17:46:51.854565 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.854540 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 17:46:51.871623 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.871597 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/northd/0.log"
Apr 24 17:46:51.906258 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.906231 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/nbdb/0.log"
Apr 24 17:46:51.926046 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:51.926018 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/sbdb/0.log"
Apr 24 17:46:52.087887 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:52.087780 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f9px_4d355f77-a36f-48e8-a168-c520805efc91/ovnkube-controller/0.log"
Apr 24 17:46:53.764693 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:53.764661 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kddtl_f55c507c-3b62-4dd9-9a36-9fd09f9412cb/network-check-target-container/0.log"
Apr 24 17:46:54.660030 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:54.659999 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fg8mz_3cf6c854-7267-4e50-9e1d-4fdd84304910/iptables-alerter/0.log"
Apr 24 17:46:55.293781 ip-10-0-137-179 kubenswrapper[2559]: I0424 17:46:55.293750 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lkl8j_513c5343-09a1-48a6-8da2-774588de4d58/tuned/0.log"