Apr 20 12:11:51.510950 ip-10-0-135-187 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 12:11:51.510962 ip-10-0-135-187 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 12:11:51.510968 ip-10-0-135-187 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 12:11:51.511221 ip-10-0-135-187 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 12:12:01.563224 ip-10-0-135-187 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 12:12:01.563238 ip-10-0-135-187 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot db127754770d46109fd41a51bf9ab0bb --
Apr 20 12:14:10.376265 ip-10-0-135-187 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 12:14:10.971229 ip-10-0-135-187 systemd[1]: Started Kubernetes Kubelet.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.856614    2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861075    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861089    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861094    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861097    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861099    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861102    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861105    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861107    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:10.978195 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861110    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861113    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861116    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861118    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861121    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861124    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861126    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861129    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861132    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861135    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861137    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861139    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861142    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861145    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861147    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861150    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861153    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861156    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861158    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861161    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:10.983947 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861165    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861175    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861178    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861182    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861185    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861188    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861190    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861193    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861195    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861198    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861200    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861203    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861205    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861208    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861210    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861213    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861215    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861218    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861220    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861223    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:11.183076 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861225    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861228    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861230    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861233    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861235    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861238    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861240    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861243    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861245    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861248    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861250    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861253    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861256    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861259    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861261    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861264    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861267    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861269    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861272    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:11.183777 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861274    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861276    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861279    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861281    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861284    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861286    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861288    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861291    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861294    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861299    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861303    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861306    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861309    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861311    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861314    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861316    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861319    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861322    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861324    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:11.184655 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861693    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861699    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861702    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861705    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861707    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861711    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861713    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861716    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861719    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861721    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861724    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861726    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861729    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861732    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861734    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861737    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861739    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861742    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861744    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861747    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:11.185295 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861749    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861752    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861754    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861756    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861759    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861762    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861764    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861767    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861769    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861772    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861775    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861777    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861780    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861783    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861785    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861788    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861790    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861793    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861795    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861798    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:11.185946 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861801    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861804    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861806    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861809    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861811    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861814    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861816    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861819    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861821    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861824    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861826    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861829    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861832    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861835    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861838    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861840    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861843    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861845    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861848    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:11.186658 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861850    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861853    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861859    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861862    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861865    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861869    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861872    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861875    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861877    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861880    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861882    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861885    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861887    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861890    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861892    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861895    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861897    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861900    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861902    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:11.187525 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861904    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861907    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861909    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861912    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861914    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861917    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861919    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.861921    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863471    2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863485    2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863493    2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863501    2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863506    2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863510    2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863514    2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863520    2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863523    2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863526    2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863530    2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863533    2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863536    2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863539    2570 flags.go:64] FLAG: --cgroup-root=""
Apr 20 12:14:11.188767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863541    2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863544    2570 flags.go:64] FLAG: --client-ca-file=""
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863547    2570 flags.go:64] FLAG: --cloud-config=""
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863550    2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863552    2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863557    2570 flags.go:64] FLAG: --cluster-domain=""
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863559    2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863562    2570 flags.go:64] FLAG: --config-dir=""
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863566    2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863569    2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863573    2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863575    2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863578    2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863581    2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863586    2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863589    2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863592    2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863595    2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863598    2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863602    2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863605    2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863608    2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863611    2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863615    2570 flags.go:64] FLAG: --enable-server="true"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863618    2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 12:14:11.189386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863624    2570 flags.go:64] FLAG: --event-burst="100"
Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863628    2570 flags.go:64] FLAG: --event-qps="50"
Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863631    2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863634    2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863637    2570 flags.go:64] FLAG: --eviction-hard=""
Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 
12:14:10.863641 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863644 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863647 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863649 2570 flags.go:64] FLAG: --eviction-soft="" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863652 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863655 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863658 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863661 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863663 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863666 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863669 2570 flags.go:64] FLAG: --feature-gates="" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863673 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863676 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863679 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863682 2570 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863685 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863689 2570 flags.go:64] FLAG: --help="false" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863692 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863695 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 12:14:11.190116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863698 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863700 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863704 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863707 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863710 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863713 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863715 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863718 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863723 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 12:14:11.190728 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863726 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863729 2570 flags.go:64] FLAG: --kube-reserved="" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863732 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863734 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863737 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863740 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863743 2570 flags.go:64] FLAG: --lock-file="" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863745 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863748 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863752 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863757 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863760 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863763 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863765 2570 flags.go:64] FLAG: --logging-format="text" Apr 20 12:14:11.190728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863768 2570 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863771 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863774 2570 flags.go:64] FLAG: --manifest-url="" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863777 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863781 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863784 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863789 2570 flags.go:64] FLAG: --max-pods="110" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863792 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863795 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863798 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863801 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863804 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863807 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863810 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863816 2570 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863819 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863822 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863827 2570 flags.go:64] FLAG: --pod-cidr="" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863830 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863835 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863838 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863841 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863844 2570 flags.go:64] FLAG: --port="10250" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863847 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 12:14:11.191296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863849 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04aeb57602e60a7e9" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863852 2570 flags.go:64] FLAG: --qos-reserved="" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863855 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863858 2570 flags.go:64] FLAG: --register-node="true" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863861 2570 flags.go:64] FLAG: --register-schedulable="true" 
Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863864 2570 flags.go:64] FLAG: --register-with-taints="" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863868 2570 flags.go:64] FLAG: --registry-burst="10" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863870 2570 flags.go:64] FLAG: --registry-qps="5" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863886 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863891 2570 flags.go:64] FLAG: --reserved-memory="" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863895 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863898 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863902 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863904 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863907 2570 flags.go:64] FLAG: --runonce="false" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863911 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863914 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863917 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863920 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863922 2570 flags.go:64] FLAG: 
--storage-driver-buffer-duration="1m0s" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863925 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863928 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863931 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863934 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863937 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863941 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 12:14:11.191900 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863945 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863947 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863950 2570 flags.go:64] FLAG: --system-cgroups="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863953 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863959 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863962 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863964 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863968 2570 flags.go:64] FLAG: --tls-min-version="" Apr 20 12:14:11.192539 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863971 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863974 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863977 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863979 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863982 2570 flags.go:64] FLAG: --v="2" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863986 2570 flags.go:64] FLAG: --version="false" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863990 2570 flags.go:64] FLAG: --vmodule="" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863994 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.863997 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864104 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864108 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864111 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864114 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864118 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix 
Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864121 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 12:14:11.192539 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864124 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864127 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864129 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864132 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864134 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864137 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864140 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864143 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864146 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864149 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864152 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864154 2570 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864157 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864159 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864162 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864164 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864167 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864169 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864172 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864174 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 12:14:11.193083 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864177 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864179 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864182 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864184 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: 
W0420 12:14:10.864186 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864189 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864192 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864194 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864196 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864199 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864203 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864206 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864208 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864211 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864213 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864216 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864218 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 12:14:11.193589 
ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864222 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864225 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864228 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 12:14:11.193589 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864232 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864234 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864237 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864240 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864243 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864245 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864248 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864250 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864252 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: 
W0420 12:14:10.864255 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864257 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864260 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864262 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864265 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864267 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864270 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864273 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864275 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864278 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 12:14:11.194088 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864280 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864283 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864285 2570 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAWS Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864288 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864292 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864295 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864298 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864300 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864303 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864306 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864308 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864311 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864313 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864317 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864319 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 12:14:11.194546 
ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864322 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864324 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864327 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864329 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 12:14:11.194546 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864332 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.864334 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.865060 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.873001 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.873029 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873076 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 12:14:11.195058 
ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873081 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873085 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873088 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873091 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873094 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873097 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873100 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873103 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873106 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:11.195058 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873109 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873112 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873115 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873118 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873120 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873123 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873125 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873129 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873134 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873137 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873140 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873142 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873145 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873148 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873150 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873153 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873155 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873158 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873160 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873166 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:11.195435 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873169 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873172 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873175 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873178 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873180 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873183 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873185 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873187 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873190 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873192 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873195 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873197 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873200 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873202 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873205 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873207 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873209 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873212 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873214 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873217 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:11.195914 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873219 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873222 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873224 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873227 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873229 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873232 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873234 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873237 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873241 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873245 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873248 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873254 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873257 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873259 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873263 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873266 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873268 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873271 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873274 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:11.196508 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873276 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873278 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873281 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873284 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873286 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873289 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873291 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873294 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873296 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873299 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873301 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873304 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873306 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873309 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873311 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873314 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:11.196965 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873316 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.873321 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873420 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873424 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873427 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873430 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873433 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873436 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873440 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873444 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873446 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873449 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873452 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873455 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873458 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:11.197360 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873460 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873463 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873465 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873468 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873470 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873473 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873475 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873478 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873480 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873483 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873485 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873488 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873491 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873493 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873496 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873498 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873500 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873503 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873505 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:11.197730 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873507 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873510 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873512 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873515 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873517 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873520 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873523 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873527 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873529 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873532 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873534 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873537 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873539 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873543 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873546 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873550 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873553 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873556 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873559 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:11.198189 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873562 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873564 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873567 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873569 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873572 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873574 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873577 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873579 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873582 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873584 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873587 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873589 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873592 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873594 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873596 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873599 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873601 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873604 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873606 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873610 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:11.198646 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873615 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873617 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873619 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873622 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873624 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873627 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873630 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873632 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873634 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873637 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873639 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873642 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873644 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873647 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:10.873649 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:11.199132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.873654 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.873755 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.876784 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.877853 2570 server.go:1019] "Starting client certificate rotation"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.878401 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.878443 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.906908 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.910457 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.925705 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.932417 2570 log.go:25] "Validated CRI v1 image API"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.933882 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.934788 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 12:14:11.199501 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.939610 2570 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 834020df-3f20-4edd-bd6c-f64e3a7102f4:/dev/nvme0n1p3 9c3a74f9-0b81-4bca-82a0-5e9b03c85b8e:/dev/nvme0n1p4]
Apr 20 12:14:11.199796 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.939631 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.946335 2570 manager.go:217] Machine: {Timestamp:2026-04-20 12:14:10.943929066 +0000 UTC m=+0.447488137 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3214510 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d322d0f037c85b3006a2808102a53 SystemUUID:ec2d322d-0f03-7c85-b300-6a2808102a53 BootID:db127754-770d-4610-9fd4-1a51bf9ab0bb Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:29:f8:4d:40:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:29:f8:4d:40:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:7f:b5:0f:06:5c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.946432 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.946503 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.947643 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.947666 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-187.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.947798 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.947806 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 12:14:11.199829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.947820 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.948748 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.949996 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.950254 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.952995 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.953038 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.953052 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.953062 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.953071 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.954268 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.954281 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.956091 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hxq89"
Apr 20 12:14:11.200127 ip-10-0-135-187
kubenswrapper[2570]: I0420 12:14:10.957741 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.963166 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.963597 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hxq89" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964748 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964762 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964768 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964789 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964813 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964821 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964829 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.964837 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 12:14:11.200127 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:10.965216 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.965235 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.965251 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.965267 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:10.966293 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-187.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.966369 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.966385 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:10.966419 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.970292 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.970352 2570 server.go:1295] "Started kubelet" Apr 20 12:14:11.200127 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.970434 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.970492 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.970438 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.971941 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.973062 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.980912 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.981549 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:10.982468 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.982687 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 12:14:11.200127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.982689 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.982705 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.982786 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 20 12:14:11.201171 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.982795 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984641 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984654 2570 factory.go:55] Registering systemd factory Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984662 2570 factory.go:223] Registration of the systemd container factory successfully Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984853 2570 factory.go:153] Registering CRI-O factory Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984867 2570 factory.go:223] Registration of the crio container factory successfully Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984895 2570 factory.go:103] Registering Raw factory Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.984952 2570 manager.go:1196] Started watching for new ooms in manager Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.985342 2570 manager.go:319] Starting recovery of all containers Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:10.988595 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.988719 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:10.991588 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-187.ec2.internal\" not found" node="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.993677 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-187.ec2.internal" not found Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:10.999778 2570 manager.go:324] Recovery completed Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.004781 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.008792 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.008818 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.008828 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.009262 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.009272 2570 cpu_manager.go:223] "Reconciling" 
reconcilePeriod="10s" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.009291 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.011473 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-187.ec2.internal" not found Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.012151 2570 policy_none.go:49] "None policy: Start" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.012164 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.012173 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062533 2570 manager.go:341] "Starting Device Plugin manager" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.062563 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062572 2570 server.go:85] "Starting device plugin registration server" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062746 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062756 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062831 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.062907 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 12:14:11.201171 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:11.062915 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.063309 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.063345 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-187.ec2.internal\" not found" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.070978 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-187.ec2.internal" not found Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.132283 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.133404 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.133427 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.133444 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.133452 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 12:14:11.201171 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.133483 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.136015 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.163007 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.164052 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.164078 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.164087 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.164110 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.174745 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.174764 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-187.ec2.internal\": node \"ip-10-0-135-187.ec2.internal\" not found" Apr 20 
12:14:11.202237 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.189827 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found" Apr 20 12:14:11.234363 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.234333 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal"] Apr 20 12:14:11.234432 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.234422 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.235281 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.235262 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.235365 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.235288 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.235365 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.235300 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.236675 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.236663 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.236821 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.236808 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.236850 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.236835 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.237357 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237342 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.237410 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237348 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.237410 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237393 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.237410 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237404 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.237498 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237372 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.237498 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.237443 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.238881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.238857 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.238968 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.238893 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:11.239493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.239477 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:11.239575 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.239505 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:11.239575 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.239517 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:11.270516 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.270494 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-187.ec2.internal\" not found" node="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.274689 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.274672 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-187.ec2.internal\" not found" node="ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.284906 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.284892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.284974 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:11.284915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.284974 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.284930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5dfa79b55f33a77023d71de337bc06ee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-187.ec2.internal\" (UID: \"5dfa79b55f33a77023d71de337bc06ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.290737 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.290723 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found" Apr 20 12:14:11.385540 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.385596 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.385596 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5dfa79b55f33a77023d71de337bc06ee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-187.ec2.internal\" (UID: \"5dfa79b55f33a77023d71de337bc06ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.385596 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5dfa79b55f33a77023d71de337bc06ee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-187.ec2.internal\" (UID: \"5dfa79b55f33a77023d71de337bc06ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.385678 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.385678 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.385623 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a76448c86f51f1e7a46c54006967da8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal\" (UID: \"6a76448c86f51f1e7a46c54006967da8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" Apr 20 12:14:11.391584 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.391569 2570 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.492356 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.492296 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.573504 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.573480 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal"
Apr 20 12:14:11.576959 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.576944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal"
Apr 20 12:14:11.592918 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.592894 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.693396 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.693367 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.793966 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.793897 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.848222 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.848200 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:11.877602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.877583 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 12:14:11.877701 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.877687 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 12:14:11.877751 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.877735 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 12:14:11.877781 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.877737 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 12:14:11.894968 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:11.894945 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-187.ec2.internal\" not found"
Apr 20 12:14:11.966487 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.966447 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 12:09:10 +0000 UTC" deadline="2027-12-21 15:22:09.755421003 +0000 UTC"
Apr 20 12:14:11.966487 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.966483 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14643h7m57.788941506s"
Apr 20 12:14:11.981498 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.981473 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 12:14:11.982826 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.982812 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:11.991253 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:11.991234 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 12:14:12.021762 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.021741 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gs7ld"
Apr 20 12:14:12.029480 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.029463 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gs7ld"
Apr 20 12:14:12.083012 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.082985 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal"
Apr 20 12:14:12.084347 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:12.084317 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfa79b55f33a77023d71de337bc06ee.slice/crio-b379f2e78e3f5591a6f6a31e779b14a8817b2e24fbad97d8d189fbd096b56196 WatchSource:0}: Error finding container b379f2e78e3f5591a6f6a31e779b14a8817b2e24fbad97d8d189fbd096b56196: Status 404 returned error can't find the container with id b379f2e78e3f5591a6f6a31e779b14a8817b2e24fbad97d8d189fbd096b56196
Apr 20 12:14:12.084690 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:12.084667 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a76448c86f51f1e7a46c54006967da8.slice/crio-602b5f2fb9db68270797c3a677113c621fcf36d8f25571f70de666f8487e9f90 WatchSource:0}: Error finding container 602b5f2fb9db68270797c3a677113c621fcf36d8f25571f70de666f8487e9f90: Status 404 returned error can't find the container with id 602b5f2fb9db68270797c3a677113c621fcf36d8f25571f70de666f8487e9f90
Apr 20 12:14:12.088733 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.088717 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:14:12.094392 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.094375 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 12:14:12.095332 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.095314 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal"
Apr 20 12:14:12.102263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.102249 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 12:14:12.135849 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.135809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" event={"ID":"6a76448c86f51f1e7a46c54006967da8","Type":"ContainerStarted","Data":"602b5f2fb9db68270797c3a677113c621fcf36d8f25571f70de666f8487e9f90"}
Apr 20 12:14:12.138563 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.138519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" event={"ID":"5dfa79b55f33a77023d71de337bc06ee","Type":"ContainerStarted","Data":"b379f2e78e3f5591a6f6a31e779b14a8817b2e24fbad97d8d189fbd096b56196"}
Apr 20 12:14:12.859875 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.859839 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:12.954111 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.954087 2570 apiserver.go:52] "Watching apiserver"
Apr 20 12:14:12.962097 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.962075 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 12:14:12.963192 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.963171 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zbrg9","openshift-multus/multus-vrsq9","openshift-multus/network-metrics-daemon-8xfxk","kube-system/konnectivity-agent-bszzh","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc","openshift-network-diagnostics/network-check-target-hf6xh","openshift-network-operator/iptables-alerter-t7cr8","openshift-ovn-kubernetes/ovnkube-node-pw7dn","kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal","openshift-cluster-node-tuning-operator/tuned-9nxbw","openshift-dns/node-resolver-9zdrr","openshift-image-registry/node-ca-x4k8h","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal"]
Apr 20 12:14:12.965072 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.965052 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:12.966154 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.966123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.967565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.967420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9mpdr\""
Apr 20 12:14:12.967565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.967436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.967565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.967483 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.967565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.967503 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 12:14:12.967565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.967557 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:12.967894 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:12.967618 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:12.968517 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.968496 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqd4g\""
Apr 20 12:14:12.968614 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.968532 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 12:14:12.968668 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.968630 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 12:14:12.968668 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.968635 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.968759 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.968714 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.969761 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.969742 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bszzh"
Apr 20 12:14:12.970411 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.969919 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.970975 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.970958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:12.971080 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:12.971045 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972233 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972358 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sxnmk\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972384 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972278 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972653 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972663 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zpmrc\""
Apr 20 12:14:12.972783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.972699 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.973762 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.973744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.974570 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.974551 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 12:14:12.974810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.974793 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nngnt\""
Apr 20 12:14:12.974890 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.974849 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 12:14:12.975192 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.975173 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.975635 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.975613 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 12:14:12.976460 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.976441 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 12:14:12.976583 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.976549 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rjm22\""
Apr 20 12:14:12.976844 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.976825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 12:14:12.977207 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.977186 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 12:14:12.977428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.977407 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.977584 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.977553 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qsd9s\""
Apr 20 12:14:12.977749 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.977734 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.977945 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.977928 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.978145 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.978126 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.979042 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.978877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x4k8h"
Apr 20 12:14:12.979398 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.979378 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9zdrr"
Apr 20 12:14:12.981030 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.980996 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rh5tl\""
Apr 20 12:14:12.981278 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.981261 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.981375 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.981290 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 12:14:12.981375 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.981297 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.982428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.982207 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 12:14:12.982428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.982293 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 12:14:12.982428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.982376 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w7txh\""
Apr 20 12:14:12.983856 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.983838 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 12:14:12.993327 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-sys-fs\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.993454 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-sys\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993454 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-kubelet\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993454 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993421 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-bin\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993454 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993451 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-script-lib\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-conf\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-socket-dir-parent\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993569 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-os-release\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993591 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-socket-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.993666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-kubernetes\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-systemd\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-run\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-konnectivity-ca\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-var-lib-kubelet\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-systemd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-log-socket\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-netd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-conf-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-etc-kubernetes\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.993938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-device-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f8fe785-c3f9-4d97-9683-833f64ab21aa-serviceca\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.993983 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a627833c-c9d5-450e-a3be-cae5d2eed758-ovn-node-metrics-cert\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-registration-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/751a35e3-65ef-4efa-80f2-9cecd4a7b003-hosts-file\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-systemd-units\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-cnibin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-os-release\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-bin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994211 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-modprobe-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994251 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:12.994491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994275 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994300 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-config\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-kubelet\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994345 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcg6\" (UniqueName: \"kubernetes.io/projected/3f8fe785-c3f9-4d97-9683-833f64ab21aa-kube-api-access-4mcg6\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-ovn\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-daemon-config\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-lib-modules\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpjfh\" (UniqueName: \"kubernetes.io/projected/a627833c-c9d5-450e-a3be-cae5d2eed758-kube-api-access-mpjfh\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-cni-binary-copy\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-slash\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-iptables-alerter-script\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:12.995051 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-host-slash\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994638 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:12.995565 ip-10-0-135-187
kubenswrapper[2570]: I0420 12:14:12.994675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzmfx\" (UniqueName: \"kubernetes.io/projected/865bf984-f3c6-4787-a88b-65144a2e4549-kube-api-access-hzmfx\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-host\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzb4v\" (UniqueName: \"kubernetes.io/projected/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-kube-api-access-wzb4v\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pkpr\" (UniqueName: \"kubernetes.io/projected/faa17428-5484-40d3-9fb5-b11e5a64f1be-kube-api-access-8pkpr\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994797 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-cnibin\") 
pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-var-lib-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-hostroot\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-multus-certs\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-tuned\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994975 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-tmp\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.994994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-etc-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.995565 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-multus\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mk9k\" (UniqueName: \"kubernetes.io/projected/a9b88536-8286-4c7c-8747-3f37573024a1-kube-api-access-8mk9k\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:12.996219 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:12.995153 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/751a35e3-65ef-4efa-80f2-9cecd4a7b003-tmp-dir\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995180 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-netns\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995203 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-system-cni-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-env-overrides\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-k8s-cni-cncf-io\") pod \"multus-vrsq9\" (UID: 
\"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysconfig\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h77qq\" (UniqueName: \"kubernetes.io/projected/c68b8adf-d528-45fe-9c38-cc642642a5aa-kube-api-access-h77qq\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-agent-certs\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-netns\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9ggh\" (UniqueName: \"kubernetes.io/projected/751a35e3-65ef-4efa-80f2-9cecd4a7b003-kube-api-access-n9ggh\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:12.996219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f8fe785-c3f9-4d97-9683-833f64ab21aa-host\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:12.996834 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-node-log\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:12.996834 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995479 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-system-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.996834 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995508 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4t9nb\" (UniqueName: \"kubernetes.io/projected/f08f930e-1834-40c7-9e3c-4cfd5402147c-kube-api-access-4t9nb\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:12.996834 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.995564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:12.999946 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:12.999931 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:14:13.030601 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.030577 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:09:12 +0000 UTC" deadline="2027-10-28 19:01:06.435527804 +0000 UTC" Apr 20 12:14:13.030601 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.030600 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13350h46m53.40493092s" Apr 20 12:14:13.096684 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-netd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-conf-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-etc-kubernetes\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096747 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-device-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096773 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-conf-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-device-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.096810 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096774 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-netd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096819 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-etc-kubernetes\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f8fe785-c3f9-4d97-9683-833f64ab21aa-serviceca\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a627833c-c9d5-450e-a3be-cae5d2eed758-ovn-node-metrics-cert\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-registration-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.096994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/751a35e3-65ef-4efa-80f2-9cecd4a7b003-hosts-file\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-systemd-units\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-cnibin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097081 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-os-release\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.097139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-bin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-systemd-units\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-bin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-modprobe-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-os-release\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097258 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097266 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-config\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-kubelet\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097317 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcg6\" (UniqueName: \"kubernetes.io/projected/3f8fe785-c3f9-4d97-9683-833f64ab21aa-kube-api-access-4mcg6\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 
20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097337 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 12:14:13.097630 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-ovn\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097294 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-registration-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097400 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-daemon-config\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097424 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-lib-modules\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097455 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-cnibin\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpjfh\" (UniqueName: \"kubernetes.io/projected/a627833c-c9d5-450e-a3be-cae5d2eed758-kube-api-access-mpjfh\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-cni-binary-copy\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097538 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-slash\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097563 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-iptables-alerter-script\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-host-slash\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr
20 12:14:13.098311 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzmfx\" (UniqueName: \"kubernetes.io/projected/865bf984-f3c6-4787-a88b-65144a2e4549-kube-api-access-hzmfx\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-host\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzb4v\" (UniqueName: \"kubernetes.io/projected/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-kube-api-access-wzb4v\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pkpr\" (UniqueName: \"kubernetes.io/projected/faa17428-5484-40d3-9fb5-b11e5a64f1be-kube-api-access-8pkpr\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097747 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-cnibin\") pod \"multus-additional-cni-plugins-zbrg9\" (UID:
\"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f8fe785-c3f9-4d97-9683-833f64ab21aa-serviceca\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-var-lib-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-hostroot\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097824 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-multus-certs\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-tuned\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-kubelet\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-tmp\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-etc-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-multus\") pod \"multus-vrsq9\" (UID:
\"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.098930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-config\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-modprobe-d\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mk9k\" (UniqueName: \"kubernetes.io/projected/a9b88536-8286-4c7c-8747-3f37573024a1-kube-api-access-8mk9k\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-slash\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-host-slash\") pod \"iptables-alerter-t7cr8\" (UID: 
\"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-cnibin\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-ovn\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/751a35e3-65ef-4efa-80f2-9cecd4a7b003-tmp-dir\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098347 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-etc-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: 
\"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-host\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098455 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-var-lib-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-lib-modules\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098489 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-hostroot\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098524 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-multus-certs\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " 
pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.099562 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098528 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-cni-binary-copy\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-var-lib-cni-multus\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098591 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-ovn-kubernetes\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098665 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-iptables-alerter-script\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-netns\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") "
pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.097364 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/751a35e3-65ef-4efa-80f2-9cecd4a7b003-hosts-file\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098830 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-system-cni-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-env-overrides\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName:
\"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-k8s-cni-cncf-io\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysconfig\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h77qq\" (UniqueName: \"kubernetes.io/projected/c68b8adf-d528-45fe-9c38-cc642642a5aa-kube-api-access-h77qq\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/865bf984-f3c6-4787-a88b-65144a2e4549-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.098977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:13.100245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099008 2570 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/751a35e3-65ef-4efa-80f2-9cecd4a7b003-tmp-dir\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-run-netns\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-agent-certs\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099110 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-system-cni-dir\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099114 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-netns\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099141 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-k8s-cni-cncf-io\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9ggh\" (UniqueName: \"kubernetes.io/projected/751a35e3-65ef-4efa-80f2-9cecd4a7b003-kube-api-access-n9ggh\") pod \"node-resolver-9zdrr\" (UID: \"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-daemon-config\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f8fe785-c3f9-4d97-9683-833f64ab21aa-host\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099311 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-node-log\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099321 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName:
\"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-host-run-netns\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-env-overrides\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysconfig\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-system-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9nb\" (UniqueName: \"kubernetes.io/projected/f08f930e-1834-40c7-9e3c-4cfd5402147c-kube-api-access-4t9nb\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.100867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName:
\"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-node-log\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-sys-fs\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f8fe785-c3f9-4d97-9683-833f64ab21aa-host\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099574 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\"
(UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-sys\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-kubelet\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-bin\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099657 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-script-lib\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.099676 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-conf\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") "
pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-sys\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099709 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-socket-dir-parent\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.099745 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:13.599726111 +0000 UTC m=+3.103285156 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:13.101493 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-socket-dir-parent\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9"
Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-kubelet\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-sys-fs\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc"
Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099959 2570
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-sysctl-conf\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-host-cni-bin\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.099984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-os-release\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100013 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-openvswitch\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-socket-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.102049 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:13.100091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/865bf984-f3c6-4787-a88b-65144a2e4549-os-release\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-kubernetes\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-systemd\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-run\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-konnectivity-ca\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:13.102049 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:13.100239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9b88536-8286-4c7c-8747-3f37573024a1-socket-dir\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.102049 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-systemd\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-var-lib-kubelet\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a627833c-c9d5-450e-a3be-cae5d2eed758-ovnkube-script-lib\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102602 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:14:13.100319 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-run\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100362 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-var-lib-kubelet\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-systemd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-multus-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-log-socket\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100488 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-log-socket\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a627833c-c9d5-450e-a3be-cae5d2eed758-run-systemd\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-kubernetes\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100663 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f08f930e-1834-40c7-9e3c-4cfd5402147c-system-cni-dir\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.100734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-konnectivity-ca\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.101084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-tmp\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.102602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.101129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c68b8adf-d528-45fe-9c38-cc642642a5aa-etc-tuned\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.103283 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.101350 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a627833c-c9d5-450e-a3be-cae5d2eed758-ovn-node-metrics-cert\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.103283 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.102101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d45ac98c-ea8f-4ad5-8034-4c9981d9693a-agent-certs\") pod \"konnectivity-agent-bszzh\" (UID: \"d45ac98c-ea8f-4ad5-8034-4c9981d9693a\") " pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:13.104713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.104116 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:14:13.104713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.104138 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:14:13.104713 ip-10-0-135-187 kubenswrapper[2570]: E0420 
12:14:13.104151 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:13.104713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.104206 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:13.604189942 +0000 UTC m=+3.107748984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:13.107005 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.106916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mk9k\" (UniqueName: \"kubernetes.io/projected/a9b88536-8286-4c7c-8747-3f37573024a1-kube-api-access-8mk9k\") pod \"aws-ebs-csi-driver-node-gzwxc\" (UID: \"a9b88536-8286-4c7c-8747-3f37573024a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.107892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.107775 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzb4v\" (UniqueName: \"kubernetes.io/projected/d4423a85-bbfb-4bda-a5c0-8729c2068a9b-kube-api-access-wzb4v\") pod \"iptables-alerter-t7cr8\" (UID: \"d4423a85-bbfb-4bda-a5c0-8729c2068a9b\") " pod="openshift-network-operator/iptables-alerter-t7cr8" Apr 20 
12:14:13.107892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.107778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcg6\" (UniqueName: \"kubernetes.io/projected/3f8fe785-c3f9-4d97-9683-833f64ab21aa-kube-api-access-4mcg6\") pod \"node-ca-x4k8h\" (UID: \"3f8fe785-c3f9-4d97-9683-833f64ab21aa\") " pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:13.107892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.107818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pkpr\" (UniqueName: \"kubernetes.io/projected/faa17428-5484-40d3-9fb5-b11e5a64f1be-kube-api-access-8pkpr\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:13.107892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.107783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzmfx\" (UniqueName: \"kubernetes.io/projected/865bf984-f3c6-4787-a88b-65144a2e4549-kube-api-access-hzmfx\") pod \"multus-additional-cni-plugins-zbrg9\" (UID: \"865bf984-f3c6-4787-a88b-65144a2e4549\") " pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.108202 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.108177 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpjfh\" (UniqueName: \"kubernetes.io/projected/a627833c-c9d5-450e-a3be-cae5d2eed758-kube-api-access-mpjfh\") pod \"ovnkube-node-pw7dn\" (UID: \"a627833c-c9d5-450e-a3be-cae5d2eed758\") " pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.109390 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.109367 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9ggh\" (UniqueName: \"kubernetes.io/projected/751a35e3-65ef-4efa-80f2-9cecd4a7b003-kube-api-access-n9ggh\") pod \"node-resolver-9zdrr\" (UID: 
\"751a35e3-65ef-4efa-80f2-9cecd4a7b003\") " pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:13.109854 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.109816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h77qq\" (UniqueName: \"kubernetes.io/projected/c68b8adf-d528-45fe-9c38-cc642642a5aa-kube-api-access-h77qq\") pod \"tuned-9nxbw\" (UID: \"c68b8adf-d528-45fe-9c38-cc642642a5aa\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.112754 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.112704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9nb\" (UniqueName: \"kubernetes.io/projected/f08f930e-1834-40c7-9e3c-4cfd5402147c-kube-api-access-4t9nb\") pod \"multus-vrsq9\" (UID: \"f08f930e-1834-40c7-9e3c-4cfd5402147c\") " pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.279147 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.279114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t7cr8" Apr 20 12:14:13.286988 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.286966 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vrsq9" Apr 20 12:14:13.294601 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.294577 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:13.300271 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.300247 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" Apr 20 12:14:13.306882 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.306863 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" Apr 20 12:14:13.313489 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.313464 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:13.319867 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.319841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" Apr 20 12:14:13.325368 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.325348 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x4k8h" Apr 20 12:14:13.329786 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.329769 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9zdrr" Apr 20 12:14:13.570862 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.570785 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:14:13.604935 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.604900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:13.605120 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:13.604975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:13.605120 ip-10-0-135-187 
kubenswrapper[2570]: E0420 12:14:13.605004 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:13.605120 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.605080 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:14.605064025 +0000 UTC m=+4.108623067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:13.605120 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.605109 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:14:13.605333 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.605127 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:14:13.605333 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.605140 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:13.605333 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:13.605196 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:14.605179242 +0000 UTC m=+4.108738300 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:13.646243 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.646213 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b88536_8286_4c7c_8747_3f37573024a1.slice/crio-9058a0449f79c12b531da2841feafcae47697bdad2d669c51219e8fcb8370c2f WatchSource:0}: Error finding container 9058a0449f79c12b531da2841feafcae47697bdad2d669c51219e8fcb8370c2f: Status 404 returned error can't find the container with id 9058a0449f79c12b531da2841feafcae47697bdad2d669c51219e8fcb8370c2f Apr 20 12:14:13.647804 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.647784 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45ac98c_ea8f_4ad5_8034_4c9981d9693a.slice/crio-8670e403e69c737647ed2443ac0b9f4efa99fb8cf1ca6c1faffe8b4ae91cfede WatchSource:0}: Error finding container 8670e403e69c737647ed2443ac0b9f4efa99fb8cf1ca6c1faffe8b4ae91cfede: Status 404 returned error can't find the container with id 8670e403e69c737647ed2443ac0b9f4efa99fb8cf1ca6c1faffe8b4ae91cfede Apr 20 12:14:13.649028 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.648910 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865bf984_f3c6_4787_a88b_65144a2e4549.slice/crio-e839768971ebbb6679293c0373d7474b04120dcc5c8254eb6c2b380cf2976f70 WatchSource:0}: Error finding container e839768971ebbb6679293c0373d7474b04120dcc5c8254eb6c2b380cf2976f70: Status 404 returned error can't find the container with id e839768971ebbb6679293c0373d7474b04120dcc5c8254eb6c2b380cf2976f70 Apr 20 12:14:13.650500 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.650269 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda627833c_c9d5_450e_a3be_cae5d2eed758.slice/crio-c59092da3b1c13f75095561269b13498e33aaf020a9024c9e4b063e52a579a69 WatchSource:0}: Error finding container c59092da3b1c13f75095561269b13498e33aaf020a9024c9e4b063e52a579a69: Status 404 returned error can't find the container with id c59092da3b1c13f75095561269b13498e33aaf020a9024c9e4b063e52a579a69 Apr 20 12:14:13.651964 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.651839 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68b8adf_d528_45fe_9c38_cc642642a5aa.slice/crio-578d930c4ed00592fad6b3ffdd88293af170b188ef4c367ca162629cca271434 WatchSource:0}: Error finding container 578d930c4ed00592fad6b3ffdd88293af170b188ef4c367ca162629cca271434: Status 404 returned error can't find the container with id 578d930c4ed00592fad6b3ffdd88293af170b188ef4c367ca162629cca271434 Apr 20 12:14:13.652981 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.652957 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8fe785_c3f9_4d97_9683_833f64ab21aa.slice/crio-f4e33a536d8e32a1e58805b7b300aab55326787adcc70645e3983cd6e32e8332 WatchSource:0}: Error finding container f4e33a536d8e32a1e58805b7b300aab55326787adcc70645e3983cd6e32e8332: Status 404 returned error can't find 
the container with id f4e33a536d8e32a1e58805b7b300aab55326787adcc70645e3983cd6e32e8332 Apr 20 12:14:13.653874 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.653548 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751a35e3_65ef_4efa_80f2_9cecd4a7b003.slice/crio-34318790e817087123411da2a9cb9f66c69a7498b4d96565f8c6d7594a9346c0 WatchSource:0}: Error finding container 34318790e817087123411da2a9cb9f66c69a7498b4d96565f8c6d7594a9346c0: Status 404 returned error can't find the container with id 34318790e817087123411da2a9cb9f66c69a7498b4d96565f8c6d7594a9346c0 Apr 20 12:14:13.655006 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.654934 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf08f930e_1834_40c7_9e3c_4cfd5402147c.slice/crio-2cc07ae2d6bd3a771f105de74056cee2ab710fffb498b5d5a4cf80c8e27030a6 WatchSource:0}: Error finding container 2cc07ae2d6bd3a771f105de74056cee2ab710fffb498b5d5a4cf80c8e27030a6: Status 404 returned error can't find the container with id 2cc07ae2d6bd3a771f105de74056cee2ab710fffb498b5d5a4cf80c8e27030a6 Apr 20 12:14:13.655873 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:13.655806 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4423a85_bbfb_4bda_a5c0_8729c2068a9b.slice/crio-bfc52dc709e52b4158bb7d8dd9be0237ec7733edff124a4fe46b69fd7685245b WatchSource:0}: Error finding container bfc52dc709e52b4158bb7d8dd9be0237ec7733edff124a4fe46b69fd7685245b: Status 404 returned error can't find the container with id bfc52dc709e52b4158bb7d8dd9be0237ec7733edff124a4fe46b69fd7685245b Apr 20 12:14:14.031757 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.031667 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:09:12 +0000 
UTC" deadline="2027-12-10 21:55:30.904344006 +0000 UTC" Apr 20 12:14:14.031757 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.031695 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14385h41m16.872652387s" Apr 20 12:14:14.134453 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.134420 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:14.134589 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.134551 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be" Apr 20 12:14:14.147396 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.147365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerStarted","Data":"e839768971ebbb6679293c0373d7474b04120dcc5c8254eb6c2b380cf2976f70"} Apr 20 12:14:14.149080 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.149052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" event={"ID":"a9b88536-8286-4c7c-8747-3f37573024a1","Type":"ContainerStarted","Data":"9058a0449f79c12b531da2841feafcae47697bdad2d669c51219e8fcb8370c2f"} Apr 20 12:14:14.150394 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.150368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vrsq9" event={"ID":"f08f930e-1834-40c7-9e3c-4cfd5402147c","Type":"ContainerStarted","Data":"2cc07ae2d6bd3a771f105de74056cee2ab710fffb498b5d5a4cf80c8e27030a6"} 
Apr 20 12:14:14.151714 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.151691 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"c59092da3b1c13f75095561269b13498e33aaf020a9024c9e4b063e52a579a69"} Apr 20 12:14:14.159131 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.159090 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bszzh" event={"ID":"d45ac98c-ea8f-4ad5-8034-4c9981d9693a","Type":"ContainerStarted","Data":"8670e403e69c737647ed2443ac0b9f4efa99fb8cf1ca6c1faffe8b4ae91cfede"} Apr 20 12:14:14.172047 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.172010 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" event={"ID":"5dfa79b55f33a77023d71de337bc06ee","Type":"ContainerStarted","Data":"d08dbb2ffc61b2d63eb2fbd5879b5b07f7c668d9661cb0f91233b34582d6b436"} Apr 20 12:14:14.174739 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.174714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t7cr8" event={"ID":"d4423a85-bbfb-4bda-a5c0-8729c2068a9b","Type":"ContainerStarted","Data":"bfc52dc709e52b4158bb7d8dd9be0237ec7733edff124a4fe46b69fd7685245b"} Apr 20 12:14:14.176588 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.176548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9zdrr" event={"ID":"751a35e3-65ef-4efa-80f2-9cecd4a7b003","Type":"ContainerStarted","Data":"34318790e817087123411da2a9cb9f66c69a7498b4d96565f8c6d7594a9346c0"} Apr 20 12:14:14.186171 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.186149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x4k8h" 
event={"ID":"3f8fe785-c3f9-4d97-9683-833f64ab21aa","Type":"ContainerStarted","Data":"f4e33a536d8e32a1e58805b7b300aab55326787adcc70645e3983cd6e32e8332"}
Apr 20 12:14:14.192042 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.192004 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" event={"ID":"c68b8adf-d528-45fe-9c38-cc642642a5aa","Type":"ContainerStarted","Data":"578d930c4ed00592fad6b3ffdd88293af170b188ef4c367ca162629cca271434"}
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.613888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:14.613954 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614103 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614123 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614135 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614192 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:16.614172807 +0000 UTC m=+6.117731865 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614261 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:14.615085 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:14.614294 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:16.614283383 +0000 UTC m=+6.117842432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:15.144166 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:15.144139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:15.144593 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:15.144297 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:15.206804 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:15.206207 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a76448c86f51f1e7a46c54006967da8" containerID="d406e130a299fa92c0f36db2954111a05ac32718907cf5d0de54edadfa858395" exitCode=0
Apr 20 12:14:15.206804 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:15.206314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" event={"ID":"6a76448c86f51f1e7a46c54006967da8","Type":"ContainerDied","Data":"d406e130a299fa92c0f36db2954111a05ac32718907cf5d0de54edadfa858395"}
Apr 20 12:14:15.221849 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:15.221800 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-187.ec2.internal" podStartSLOduration=3.221784134 podStartE2EDuration="3.221784134s" podCreationTimestamp="2026-04-20 12:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:14.18831565 +0000 UTC m=+3.691874785" watchObservedRunningTime="2026-04-20 12:14:15.221784134 +0000 UTC m=+4.725343216"
Apr 20 12:14:16.134647 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:16.134614 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:16.134906 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.134756 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:16.210970 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:16.210933 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" event={"ID":"6a76448c86f51f1e7a46c54006967da8","Type":"ContainerStarted","Data":"65ba9f58e04f8db37a7ae73d605efe210ffc0e37bcf2e8ad8871d24e75cba0c8"}
Apr 20 12:14:16.227618 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:16.226560 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-187.ec2.internal" podStartSLOduration=4.226543854 podStartE2EDuration="4.226543854s" podCreationTimestamp="2026-04-20 12:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:16.226194834 +0000 UTC m=+5.729753927" watchObservedRunningTime="2026-04-20 12:14:16.226543854 +0000 UTC m=+5.730102918"
Apr 20 12:14:16.631572 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:16.631532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:16.631730 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:16.631597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:16.631800 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.631744 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:16.631852 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.631812 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:20.631791739 +0000 UTC m=+10.135350810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:16.632431 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.632308 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:16.632431 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.632328 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:16.632431 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.632340 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:16.632431 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:16.632400 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:20.632384393 +0000 UTC m=+10.135943437 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:17.135079 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:17.135048 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:17.135308 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:17.135186 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:18.133777 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:18.133738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:18.134258 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:18.133894 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:19.133975 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:19.133944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:19.134372 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:19.134093 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:20.133831 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:20.133798 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:20.133995 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.133936 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:20.665013 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:20.664977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:20.665458 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:20.665057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:20.665458 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665170 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:20.665458 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665227 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.665209407 +0000 UTC m=+18.168768456 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:20.665624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665529 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:20.665624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665550 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:20.665624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665562 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:20.665624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:20.665616 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.665600516 +0000 UTC m=+18.169159563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:21.135279 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:21.135198 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:21.135420 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:21.135334 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:22.134470 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:22.134397 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:22.134896 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:22.134542 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:23.134709 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:23.134673 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:23.135173 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:23.134812 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:24.133995 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:24.133958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:24.134173 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:24.134105 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:25.133936 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:25.133905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:25.134407 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:25.134032 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:26.133835 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:26.133781 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:26.134015 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:26.133913 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:27.133972 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.133932 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:27.134168 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.134063 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:27.228602 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.228576 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2zqds"]
Apr 20 12:14:27.264617 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.264588 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.264767 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.264674 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:27.315378 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.315344 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.315540 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.315413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-kubelet-config\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.315540 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.315487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-dbus\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.416856 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.416773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-dbus\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.416856 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.416853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.417100 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.416898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-kubelet-config\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.417100 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.416970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-dbus\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.417100 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.416976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32b53c98-449d-4d6d-9ec4-06d42d60860e-kubelet-config\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.417100 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.417007 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:27.417100 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.417096 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:27.917076189 +0000 UTC m=+17.420635236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:27.920431 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:27.920401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:27.920648 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.920554 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:27.920648 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:27.920625 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.920609801 +0000 UTC m=+18.424168842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:28.134323 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:28.134291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:28.134762 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.134429 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:28.726129 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:28.726100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:28.726288 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:28.726150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:28.726288 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726253 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:28.726288 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726264 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:28.726288 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726281 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:28.726441 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726294 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:28.726441 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726319 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:14:44.726300559 +0000 UTC m=+34.229859602 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:28.726441 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.726339 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:44.726324921 +0000 UTC m=+34.229883966 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:28.927163 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:28.927123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:28.927331 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.927253 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:28.927331 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:28.927311 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:30.927293718 +0000 UTC m=+20.430852759 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:29.134098 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:29.133962 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:29.134261 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:29.134100 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:29.134261 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:29.134197 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:29.134374 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:29.134302 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:30.134287 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:30.134256 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:30.134441 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:30.134384 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:30.940827 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:30.940502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:30.940912 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:30.940655 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:30.940974 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:30.940960 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.940932703 +0000 UTC m=+24.444491746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:31.135273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.135248 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:31.135917 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:31.135400 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e" Apr 20 12:14:31.135917 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.135415 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:31.135917 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:31.135502 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970" Apr 20 12:14:31.236981 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.236957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9zdrr" event={"ID":"751a35e3-65ef-4efa-80f2-9cecd4a7b003","Type":"ContainerStarted","Data":"ad9806d0afce3fdaaea47097511640ac590cf2fd06120b98737ae39b9a371376"} Apr 20 12:14:31.238362 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.238327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x4k8h" event={"ID":"3f8fe785-c3f9-4d97-9683-833f64ab21aa","Type":"ContainerStarted","Data":"399bba98fd5dabf2898e39fa6dcca2b80240cc2c097cf1ac8449356ffb5ab306"} Apr 20 12:14:31.240351 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.240325 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" event={"ID":"c68b8adf-d528-45fe-9c38-cc642642a5aa","Type":"ContainerStarted","Data":"b430bcf47c29a06c71047b9519eda934ece873e5f5ed7d78a8e3567efe6212c4"} Apr 20 12:14:31.242571 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.242546 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="e51cc15c7a6a70a5d281ea34cb9dcfb61959dbb16eee63e9472874845e4d9d36" exitCode=0 Apr 20 12:14:31.242670 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.242590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"e51cc15c7a6a70a5d281ea34cb9dcfb61959dbb16eee63e9472874845e4d9d36"} Apr 20 12:14:31.247282 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.247249 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" event={"ID":"a9b88536-8286-4c7c-8747-3f37573024a1","Type":"ContainerStarted","Data":"94cd7a595700a9562a5642df7698b12b257b6a3629c30bb0fc22e1ce78719eb7"} Apr 20 12:14:31.248757 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.248740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vrsq9" event={"ID":"f08f930e-1834-40c7-9e3c-4cfd5402147c","Type":"ContainerStarted","Data":"510993bff7ae71b2c880a116b1a7e44609a78ff31e40907e16b0dea0380aa767"} Apr 20 12:14:31.252065 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.251984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"70f822106e280da1be3df4cbdf0199c55693062d92b1aba4a9804fe7525f1652"} Apr 20 12:14:31.252149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"bac3ab9c44af7fb3818d7212927b6fbe0d387a99a34a3ee3683896af74bc6da7"} Apr 20 12:14:31.252149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252089 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"ae2e88403b4fcfdcfe59e6b894956c20fafdd170f9af4ef217e1033503421b94"} Apr 20 12:14:31.252149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252102 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"8cecd6c2a52d45e68ff595d1241295ebdebe013aafe5f6d310e7deb46b3f80c1"} Apr 20 12:14:31.252149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252115 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"ff4cb4e146cb8441f52905244d6f8c4ec9de2da46f0b9213c8f8dd86a5a196d5"} Apr 20 12:14:31.252149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252127 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"9f5f08afb33cecd2f7746b1a86bcbcbf422656e0888dab902b54fe3674afd3b5"} Apr 20 12:14:31.252386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.252239 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9zdrr" podStartSLOduration=3.491418244 podStartE2EDuration="20.252227497s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.656266441 +0000 UTC m=+3.159825484" lastFinishedPulling="2026-04-20 12:14:30.417075696 +0000 UTC m=+19.920634737" observedRunningTime="2026-04-20 12:14:31.251644225 +0000 UTC m=+20.755203289" watchObservedRunningTime="2026-04-20 12:14:31.252227497 +0000 UTC m=+20.755786561" Apr 20 12:14:31.253491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.253467 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-bszzh" event={"ID":"d45ac98c-ea8f-4ad5-8034-4c9981d9693a","Type":"ContainerStarted","Data":"29caf09a787ca0febbf9b0ee923a6b2be203d15fb8346482e86ceadbda1c927d"} Apr 20 12:14:31.270320 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.270287 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vrsq9" podStartSLOduration=3.505486555 podStartE2EDuration="20.270274678s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.656611994 +0000 UTC m=+3.160171036" lastFinishedPulling="2026-04-20 12:14:30.421400102 +0000 UTC m=+19.924959159" observedRunningTime="2026-04-20 12:14:31.269935804 +0000 UTC m=+20.773494881" watchObservedRunningTime="2026-04-20 12:14:31.270274678 +0000 UTC m=+20.773833740" Apr 20 12:14:31.307649 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.307609 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9nxbw" podStartSLOduration=3.540557027 podStartE2EDuration="20.30759591s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.654340744 +0000 UTC m=+3.157899803" lastFinishedPulling="2026-04-20 12:14:30.421379632 +0000 UTC m=+19.924938686" observedRunningTime="2026-04-20 12:14:31.30736205 +0000 UTC m=+20.810921112" watchObservedRunningTime="2026-04-20 12:14:31.30759591 +0000 UTC m=+20.811154976" Apr 20 12:14:31.322443 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.322399 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x4k8h" podStartSLOduration=11.565225439 podStartE2EDuration="20.322383907s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.655253805 +0000 UTC m=+3.158812862" lastFinishedPulling="2026-04-20 12:14:22.412412288 +0000 UTC m=+11.915971330" observedRunningTime="2026-04-20 12:14:31.32167839 
+0000 UTC m=+20.825237452" watchObservedRunningTime="2026-04-20 12:14:31.322383907 +0000 UTC m=+20.825942972" Apr 20 12:14:31.336892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.336848 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bszzh" podStartSLOduration=3.5987684939999998 podStartE2EDuration="20.33683799s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.649805119 +0000 UTC m=+3.153364165" lastFinishedPulling="2026-04-20 12:14:30.387874609 +0000 UTC m=+19.891433661" observedRunningTime="2026-04-20 12:14:31.336180613 +0000 UTC m=+20.839739676" watchObservedRunningTime="2026-04-20 12:14:31.33683799 +0000 UTC m=+20.840397053" Apr 20 12:14:31.603736 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:31.603711 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 12:14:32.075132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.074780 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T12:14:31.603731484Z","UUID":"0d76dfb4-c351-4871-85fe-80dbdd5ea4c8","Handler":null,"Name":"","Endpoint":""} Apr 20 12:14:32.077943 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.077917 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 12:14:32.078107 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.077962 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 12:14:32.133965 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.133877 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:32.134118 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:32.133979 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be" Apr 20 12:14:32.257957 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.257913 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t7cr8" event={"ID":"d4423a85-bbfb-4bda-a5c0-8729c2068a9b","Type":"ContainerStarted","Data":"2911d6a354453d632bcacb3e7cc1a4213642a48341f8030a49c821fb1df8dffb"} Apr 20 12:14:32.260462 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.260432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" event={"ID":"a9b88536-8286-4c7c-8747-3f37573024a1","Type":"ContainerStarted","Data":"0e663d09ae02561aa22c1e4bb692492efcd8e38919295cf30917ecc8dad2a4f8"} Apr 20 12:14:32.279675 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:32.279627 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t7cr8" podStartSLOduration=4.580559464 podStartE2EDuration="21.279611857s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.65756862 +0000 UTC m=+3.161127661" lastFinishedPulling="2026-04-20 12:14:30.356621013 +0000 UTC m=+19.860180054" observedRunningTime="2026-04-20 12:14:32.278670589 +0000 UTC m=+21.782229653" watchObservedRunningTime="2026-04-20 12:14:32.279611857 +0000 UTC m=+21.783170920" Apr 20 12:14:33.133667 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:33.133634 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:33.133667 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:33.133669 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:33.133931 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:33.133767 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970" Apr 20 12:14:33.133931 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:33.133895 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e" Apr 20 12:14:33.264653 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:33.264610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" event={"ID":"a9b88536-8286-4c7c-8747-3f37573024a1","Type":"ContainerStarted","Data":"38da0cf12ebd238179ab1ff61a56259c077e3a56475df78d8995fda5ce7400ed"} Apr 20 12:14:33.286009 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:33.285966 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gzwxc" podStartSLOduration=3.525340435 podStartE2EDuration="22.285951511s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.648052881 +0000 UTC m=+3.151611925" lastFinishedPulling="2026-04-20 12:14:32.408663956 +0000 UTC m=+21.912223001" observedRunningTime="2026-04-20 12:14:33.285555317 +0000 UTC m=+22.789114385" watchObservedRunningTime="2026-04-20 12:14:33.285951511 +0000 UTC m=+22.789510574" Apr 20 12:14:34.134730 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:34.134692 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:34.134979 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:34.134824 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be" Apr 20 12:14:34.269747 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:34.269715 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"df490ca5ec4048484e26b5857857f4decfa7a157e474aa2ec213aa73d24242e6"} Apr 20 12:14:34.970512 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:34.970482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:34.970679 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:34.970580 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:34.970679 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:34.970639 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:42.970622279 +0000 UTC m=+32.474181320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:35.133762 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:35.133721 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:35.133875 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:35.133855 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970" Apr 20 12:14:35.133940 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:35.133924 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:35.134086 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:35.134012 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e" Apr 20 12:14:35.273091 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:35.273054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerStarted","Data":"12eac52c914aba4efd46afcbcd2680b1982d385c98f0c5854e7062a2542213bb"} Apr 20 12:14:35.537141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:35.536959 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:35.537633 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:35.537613 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:36.134412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.134380 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:36.134575 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:36.134484 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be" Apr 20 12:14:36.276211 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.276179 2570 generic.go:358] "Generic (PLEG): container finished" podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="12eac52c914aba4efd46afcbcd2680b1982d385c98f0c5854e7062a2542213bb" exitCode=0 Apr 20 12:14:36.276663 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.276254 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"12eac52c914aba4efd46afcbcd2680b1982d385c98f0c5854e7062a2542213bb"} Apr 20 12:14:36.279399 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.279376 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" event={"ID":"a627833c-c9d5-450e-a3be-cae5d2eed758","Type":"ContainerStarted","Data":"33f121aeffd1c19e394f2706344763b3f59ed70995766359fb9dbb1367f8d393"} Apr 20 12:14:36.279684 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.279596 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:36.279684 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.279636 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:36.279684 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.279650 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:36.279684 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.279661 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:36.280183 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.280169 2570 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bszzh" Apr 20 12:14:36.293686 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.293669 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:36.293770 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.293724 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" Apr 20 12:14:36.337978 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:36.337938 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn" podStartSLOduration=8.177973505 podStartE2EDuration="25.337926629s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.652194002 +0000 UTC m=+3.155753049" lastFinishedPulling="2026-04-20 12:14:30.812147127 +0000 UTC m=+20.315706173" observedRunningTime="2026-04-20 12:14:36.336379547 +0000 UTC m=+25.839938609" watchObservedRunningTime="2026-04-20 12:14:36.337926629 +0000 UTC m=+25.841485690" Apr 20 12:14:37.134127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.134096 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:37.134250 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.134109 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:37.134250 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:37.134229 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970" Apr 20 12:14:37.134365 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:37.134273 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e" Apr 20 12:14:37.283721 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.283688 2570 generic.go:358] "Generic (PLEG): container finished" podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="5815f04017da35ac83fe404cdb55390357fa4296a2926db782701437251fe965" exitCode=0 Apr 20 12:14:37.284117 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.283731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"5815f04017da35ac83fe404cdb55390357fa4296a2926db782701437251fe965"} Apr 20 12:14:37.442080 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.442046 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2zqds"] Apr 20 12:14:37.442245 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.442151 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:37.442245 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:37.442229 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:37.445371 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.445348 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hf6xh"]
Apr 20 12:14:37.445478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.445436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:37.445521 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:37.445504 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:37.445879 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.445851 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8xfxk"]
Apr 20 12:14:37.445975 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:37.445948 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:37.446090 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:37.446068 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:38.287507 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:38.287433 2570 generic.go:358] "Generic (PLEG): container finished" podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="93ea862e3b5f0be45ab61166f6e303bdcc36542c7c7a16aa9c1930a58c1bcad9" exitCode=0
Apr 20 12:14:38.288036 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:38.287512 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"93ea862e3b5f0be45ab61166f6e303bdcc36542c7c7a16aa9c1930a58c1bcad9"}
Apr 20 12:14:39.134048 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:39.134001 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:39.134221 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:39.134001 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:39.134221 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:39.134137 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:39.134345 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:39.134230 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:39.134345 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:39.134273 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:39.134345 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:39.134326 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:41.134731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:41.134689 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:41.135412 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:41.134792 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:41.135412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:41.134870 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:41.135412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:41.134964 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:41.135412 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:41.134997 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:41.135412 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:41.135045 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:43.030813 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.030775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:43.031281 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.030944 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:43.031281 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.031034 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret podName:32b53c98-449d-4d6d-9ec4-06d42d60860e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.031000601 +0000 UTC m=+48.534559646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret") pod "global-pull-secret-syncer-2zqds" (UID: "32b53c98-449d-4d6d-9ec4-06d42d60860e") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:43.134192 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.134163 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:43.134369 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.134199 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:43.134369 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.134233 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:43.134369 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.134336 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zqds" podUID="32b53c98-449d-4d6d-9ec4-06d42d60860e"
Apr 20 12:14:43.134515 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.134466 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hf6xh" podUID="6b4ca455-a400-4bf3-8bd0-0b93d1456970"
Apr 20 12:14:43.134591 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.134573 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xfxk" podUID="faa17428-5484-40d3-9fb5-b11e5a64f1be"
Apr 20 12:14:43.272050 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.271993 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-187.ec2.internal" event="NodeReady"
Apr 20 12:14:43.272198 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.272163 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 12:14:43.305128 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.305063 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"]
Apr 20 12:14:43.337567 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.337541 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nrx42"]
Apr 20 12:14:43.337712 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.337694 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.340052 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.339952 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 12:14:43.340152 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.340088 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 12:14:43.340767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.340604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 12:14:43.340767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.340672 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2vqb9\""
Apr 20 12:14:43.349161 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.348848 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 12:14:43.352071 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.352050 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q7wvt"]
Apr 20 12:14:43.352219 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.352201 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.354508 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.354479 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 12:14:43.354508 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.354504 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8c8ph\""
Apr 20 12:14:43.354642 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.354533 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 12:14:43.373790 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.373773 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"]
Apr 20 12:14:43.373790 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.373793 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7wvt"]
Apr 20 12:14:43.373952 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.373802 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrx42"]
Apr 20 12:14:43.373952 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.373889 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:43.376478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.376359 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 12:14:43.376478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.376373 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g4vct\""
Apr 20 12:14:43.376478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.376379 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 12:14:43.376656 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.376526 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 12:14:43.434472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj9lb\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpgp\" (UniqueName: \"kubernetes.io/projected/0f53da55-04fb-46ea-a138-a50e7b354151-kube-api-access-grpgp\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434617 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434646 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f53da55-04fb-46ea-a138-a50e7b354151-tmp-dir\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.434874 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434651 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434874 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434690 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434874 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434763 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.434874 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.434798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f53da55-04fb-46ea-a138-a50e7b354151-config-volume\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.535861 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535830 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.535861 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj9lb\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grpgp\" (UniqueName: \"kubernetes.io/projected/0f53da55-04fb-46ea-a138-a50e7b354151-kube-api-access-grpgp\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535916 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f53da55-04fb-46ea-a138-a50e7b354151-tmp-dir\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.535986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.536010 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.536048 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5878cc68d6-ltsqp: secret "image-registry-tls" not found
Apr 20 12:14:43.536132 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.536105 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls podName:b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:44.036082983 +0000 UTC m=+33.539642039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls") pod "image-registry-5878cc68d6-ltsqp" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e") : secret "image-registry-tls" not found
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f53da55-04fb-46ea-a138-a50e7b354151-tmp-dir\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f53da55-04fb-46ea-a138-a50e7b354151-config-volume\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536344 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pfv\" (UniqueName: \"kubernetes.io/projected/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-kube-api-access-t2pfv\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:43.536533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.536869 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.536627 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 12:14:43.536869 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.536684 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls podName:0f53da55-04fb-46ea-a138-a50e7b354151 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:44.036668063 +0000 UTC m=+33.540227107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls") pod "dns-default-nrx42" (UID: "0f53da55-04fb-46ea-a138-a50e7b354151") : secret "dns-default-metrics-tls" not found
Apr 20 12:14:43.536869 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.536793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f53da55-04fb-46ea-a138-a50e7b354151-config-volume\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.537039 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.537006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.537087 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.537074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.541204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.541083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.541310 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.541098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.543870 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.543850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.543944 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.543928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj9lb\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:43.544054 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.544035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpgp\" (UniqueName: \"kubernetes.io/projected/0f53da55-04fb-46ea-a138-a50e7b354151-kube-api-access-grpgp\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:43.637433 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.637364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pfv\" (UniqueName: \"kubernetes.io/projected/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-kube-api-access-t2pfv\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:43.637433 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.637406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:43.637553 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.637497 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 12:14:43.637553 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:43.637549 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert podName:5ffce074-ab2a-4172-a6a3-7e85c82f6eb8 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:44.137534377 +0000 UTC m=+33.641093423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert") pod "ingress-canary-q7wvt" (UID: "5ffce074-ab2a-4172-a6a3-7e85c82f6eb8") : secret "canary-serving-cert" not found
Apr 20 12:14:43.645238 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:43.645217 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2pfv\" (UniqueName: \"kubernetes.io/projected/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-kube-api-access-t2pfv\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:44.040830 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.040804 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.040847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.040936 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.040945 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5878cc68d6-ltsqp: secret "image-registry-tls" not found
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.040947 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.040988 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls podName:b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:45.040976356 +0000 UTC m=+34.544535397 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls") pod "image-registry-5878cc68d6-ltsqp" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e") : secret "image-registry-tls" not found
Apr 20 12:14:44.041162 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.041000 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls podName:0f53da55-04fb-46ea-a138-a50e7b354151 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:45.040994853 +0000 UTC m=+34.544553894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls") pod "dns-default-nrx42" (UID: "0f53da55-04fb-46ea-a138-a50e7b354151") : secret "dns-default-metrics-tls" not found
Apr 20 12:14:44.141467 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.141445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:44.141600 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.141582 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 12:14:44.141650 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.141640 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert podName:5ffce074-ab2a-4172-a6a3-7e85c82f6eb8 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:45.141626565 +0000 UTC m=+34.645185606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert") pod "ingress-canary-q7wvt" (UID: "5ffce074-ab2a-4172-a6a3-7e85c82f6eb8") : secret "canary-serving-cert" not found
Apr 20 12:14:44.302605 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.302577 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerStarted","Data":"34c933ce80a859c3790af5df36c5adaa5e383cdeea586e1602991c7b7a043cd7"}
Apr 20 12:14:44.745486 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.745411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh"
Apr 20 12:14:44.745624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.745566 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:44.745624 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:44.745574 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk"
Apr 20 12:14:44.745624 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.745587 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:44.745713 ip-10-0-135-187
kubenswrapper[2570]: E0420 12:14:44.745627 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:44.745713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.745637 2570 projected.go:194] Error preparing data for projected volume kube-api-access-ldmgk for pod openshift-network-diagnostics/network-check-target-hf6xh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:44.745713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.745675 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs podName:faa17428-5484-40d3-9fb5-b11e5a64f1be nodeName:}" failed. No retries permitted until 2026-04-20 12:15:16.745660507 +0000 UTC m=+66.249219548 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs") pod "network-metrics-daemon-8xfxk" (UID: "faa17428-5484-40d3-9fb5-b11e5a64f1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:44.745713 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:44.745687 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk podName:6b4ca455-a400-4bf3-8bd0-0b93d1456970 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:16.745681399 +0000 UTC m=+66.249240440 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ldmgk" (UniqueName: "kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk") pod "network-check-target-hf6xh" (UID: "6b4ca455-a400-4bf3-8bd0-0b93d1456970") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:45.047268 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.047195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.047312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42" Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.047341 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.047357 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5878cc68d6-ltsqp: secret "image-registry-tls" not found Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.047394 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.047418 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls podName:b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:47.047399531 +0000 UTC m=+36.550958574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls") pod "image-registry-5878cc68d6-ltsqp" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e") : secret "image-registry-tls" not found Apr 20 12:14:45.047616 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.047434 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls podName:0f53da55-04fb-46ea-a138-a50e7b354151 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:47.047425172 +0000 UTC m=+36.550984220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls") pod "dns-default-nrx42" (UID: "0f53da55-04fb-46ea-a138-a50e7b354151") : secret "dns-default-metrics-tls" not found Apr 20 12:14:45.136376 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.136356 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:14:45.136512 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.136356 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:14:45.136551 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.136356 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds" Apr 20 12:14:45.138631 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.138602 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 12:14:45.139688 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.139667 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\"" Apr 20 12:14:45.139907 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.139695 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 12:14:45.139907 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.139707 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 12:14:45.139907 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.139695 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 12:14:45.139907 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.139700 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\"" Apr 20 12:14:45.148821 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.148797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt" Apr 20 12:14:45.148921 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.148906 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:45.149071 
ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:45.149051 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert podName:5ffce074-ab2a-4172-a6a3-7e85c82f6eb8 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:47.149011531 +0000 UTC m=+36.652570572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert") pod "ingress-canary-q7wvt" (UID: "5ffce074-ab2a-4172-a6a3-7e85c82f6eb8") : secret "canary-serving-cert" not found Apr 20 12:14:45.306993 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.306912 2570 generic.go:358] "Generic (PLEG): container finished" podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="34c933ce80a859c3790af5df36c5adaa5e383cdeea586e1602991c7b7a043cd7" exitCode=0 Apr 20 12:14:45.306993 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:45.306967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"34c933ce80a859c3790af5df36c5adaa5e383cdeea586e1602991c7b7a043cd7"} Apr 20 12:14:46.311466 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.311434 2570 generic.go:358] "Generic (PLEG): container finished" podID="865bf984-f3c6-4787-a88b-65144a2e4549" containerID="78b54b037c58d634d2deb13d64d923f5d9647df27e7dfbb1a67fe134264da8f9" exitCode=0 Apr 20 12:14:46.311791 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.311498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerDied","Data":"78b54b037c58d634d2deb13d64d923f5d9647df27e7dfbb1a67fe134264da8f9"} Apr 20 12:14:46.525984 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.525957 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p"] Apr 20 12:14:46.529450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.529430 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" Apr 20 12:14:46.533035 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.533004 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 12:14:46.533239 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.533226 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mktmp\"" Apr 20 12:14:46.533396 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.533378 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:46.547047 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.547012 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p"] Apr 20 12:14:46.660923 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.660850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52mh\" (UniqueName: \"kubernetes.io/projected/0a26209c-50bb-48d5-947b-95041bb256df-kube-api-access-w52mh\") pod \"migrator-74bb7799d9-bfd6p\" (UID: \"0a26209c-50bb-48d5-947b-95041bb256df\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" Apr 20 12:14:46.761653 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.761620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w52mh\" (UniqueName: \"kubernetes.io/projected/0a26209c-50bb-48d5-947b-95041bb256df-kube-api-access-w52mh\") pod 
\"migrator-74bb7799d9-bfd6p\" (UID: \"0a26209c-50bb-48d5-947b-95041bb256df\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" Apr 20 12:14:46.770224 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.770199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52mh\" (UniqueName: \"kubernetes.io/projected/0a26209c-50bb-48d5-947b-95041bb256df-kube-api-access-w52mh\") pod \"migrator-74bb7799d9-bfd6p\" (UID: \"0a26209c-50bb-48d5-947b-95041bb256df\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" Apr 20 12:14:46.838053 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.838012 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" Apr 20 12:14:46.995162 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:46.994980 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p"] Apr 20 12:14:46.999569 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:46.999541 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a26209c_50bb_48d5_947b_95041bb256df.slice/crio-1312e21fea3e90d302c96852866e4de5b782bc6bf7620b19f69387c6d3d97ee9 WatchSource:0}: Error finding container 1312e21fea3e90d302c96852866e4de5b782bc6bf7620b19f69387c6d3d97ee9: Status 404 returned error can't find the container with id 1312e21fea3e90d302c96852866e4de5b782bc6bf7620b19f69387c6d3d97ee9 Apr 20 12:14:47.063661 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.063629 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42" Apr 20 12:14:47.063749 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.063675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" Apr 20 12:14:47.063794 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.063756 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:47.063794 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.063768 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5878cc68d6-ltsqp: secret "image-registry-tls" not found Apr 20 12:14:47.063794 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.063766 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:47.063885 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.063821 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls podName:b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:51.063805475 +0000 UTC m=+40.567364517 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls") pod "image-registry-5878cc68d6-ltsqp" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e") : secret "image-registry-tls" not found Apr 20 12:14:47.063885 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.063834 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls podName:0f53da55-04fb-46ea-a138-a50e7b354151 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:51.063828266 +0000 UTC m=+40.567387307 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls") pod "dns-default-nrx42" (UID: "0f53da55-04fb-46ea-a138-a50e7b354151") : secret "dns-default-metrics-tls" not found Apr 20 12:14:47.165087 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.165061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt" Apr 20 12:14:47.165206 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.165191 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:47.165255 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:47.165246 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert podName:5ffce074-ab2a-4172-a6a3-7e85c82f6eb8 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:51.165231233 +0000 UTC m=+40.668790275 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert") pod "ingress-canary-q7wvt" (UID: "5ffce074-ab2a-4172-a6a3-7e85c82f6eb8") : secret "canary-serving-cert" not found Apr 20 12:14:47.316343 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.316271 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" event={"ID":"865bf984-f3c6-4787-a88b-65144a2e4549","Type":"ContainerStarted","Data":"dd935d4ba997d261c68cb6a46f36b7586ae288ccdfcf567a2640bcbb70f20344"} Apr 20 12:14:47.317250 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.317218 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" event={"ID":"0a26209c-50bb-48d5-947b-95041bb256df","Type":"ContainerStarted","Data":"1312e21fea3e90d302c96852866e4de5b782bc6bf7620b19f69387c6d3d97ee9"} Apr 20 12:14:47.338995 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:47.338944 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zbrg9" podStartSLOduration=5.996359759 podStartE2EDuration="36.338930592s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:13.651229516 +0000 UTC m=+3.154788571" lastFinishedPulling="2026-04-20 12:14:43.993800362 +0000 UTC m=+33.497359404" observedRunningTime="2026-04-20 12:14:47.336480619 +0000 UTC m=+36.840039682" watchObservedRunningTime="2026-04-20 12:14:47.338930592 +0000 UTC m=+36.842489653" Apr 20 12:14:48.489771 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.489744 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c2d98"] Apr 20 12:14:48.502189 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.502161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c2d98"] Apr 20 
12:14:48.502314 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.502260 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.504729 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.504706 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 12:14:48.505734 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.505691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 12:14:48.505734 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.505711 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4b5hc\"" Apr 20 12:14:48.505734 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.505715 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 12:14:48.505734 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.505733 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 12:14:48.576428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.576408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-cabundle\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.576538 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.576446 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pttgc\" (UniqueName: 
\"kubernetes.io/projected/0761a9c6-50b8-4f03-b0c7-65ce207c869d-kube-api-access-pttgc\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.576588 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.576570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-key\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.677128 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.677056 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-key\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.677128 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.677111 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-cabundle\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.677231 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.677163 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pttgc\" (UniqueName: \"kubernetes.io/projected/0761a9c6-50b8-4f03-b0c7-65ce207c869d-kube-api-access-pttgc\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.677822 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.677799 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-cabundle\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.680271 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.680251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0761a9c6-50b8-4f03-b0c7-65ce207c869d-signing-key\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.686116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.686092 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pttgc\" (UniqueName: \"kubernetes.io/projected/0761a9c6-50b8-4f03-b0c7-65ce207c869d-kube-api-access-pttgc\") pod \"service-ca-865cb79987-c2d98\" (UID: \"0761a9c6-50b8-4f03-b0c7-65ce207c869d\") " pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.740909 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.740861 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9zdrr_751a35e3-65ef-4efa-80f2-9cecd4a7b003/dns-node-resolver/0.log" Apr 20 12:14:48.811889 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.811860 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c2d98" Apr 20 12:14:48.929656 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:48.929629 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c2d98"] Apr 20 12:14:48.932463 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:48.932437 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0761a9c6_50b8_4f03_b0c7_65ce207c869d.slice/crio-84c4c908795213a205075a50b62ae78454aa394fa652b46b0863793b75744c36 WatchSource:0}: Error finding container 84c4c908795213a205075a50b62ae78454aa394fa652b46b0863793b75744c36: Status 404 returned error can't find the container with id 84c4c908795213a205075a50b62ae78454aa394fa652b46b0863793b75744c36 Apr 20 12:14:49.322531 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:49.322499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" event={"ID":"0a26209c-50bb-48d5-947b-95041bb256df","Type":"ContainerStarted","Data":"ca1aa02e50c1f28a9111007741556033345c428e24f3b4884167c8dee05e0175"} Apr 20 12:14:49.322677 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:49.322538 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" event={"ID":"0a26209c-50bb-48d5-947b-95041bb256df","Type":"ContainerStarted","Data":"c04c78ad45144d3cde4e5e046cb515385e214f7d49a62b5d4c2a4a5f889dcbdb"} Apr 20 12:14:49.323496 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:49.323473 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c2d98" event={"ID":"0761a9c6-50b8-4f03-b0c7-65ce207c869d","Type":"ContainerStarted","Data":"84c4c908795213a205075a50b62ae78454aa394fa652b46b0863793b75744c36"} Apr 20 12:14:49.340662 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:49.340619 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bfd6p" podStartSLOduration=1.689496959 podStartE2EDuration="3.340606723s" podCreationTimestamp="2026-04-20 12:14:46 +0000 UTC" firstStartedPulling="2026-04-20 12:14:47.001540328 +0000 UTC m=+36.505099383" lastFinishedPulling="2026-04-20 12:14:48.65265009 +0000 UTC m=+38.156209147" observedRunningTime="2026-04-20 12:14:49.340182597 +0000 UTC m=+38.843741659" watchObservedRunningTime="2026-04-20 12:14:49.340606723 +0000 UTC m=+38.844165786"
Apr 20 12:14:49.944707 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:49.944677 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x4k8h_3f8fe785-c3f9-4d97-9683-833f64ab21aa/node-ca/0.log"
Apr 20 12:14:50.941299 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:50.941273 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bfd6p_0a26209c-50bb-48d5-947b-95041bb256df/migrator/0.log"
Apr 20 12:14:51.098679 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:51.098647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:51.098691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.098775 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.098792 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.098803 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5878cc68d6-ltsqp: secret "image-registry-tls" not found
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.098829 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls podName:0f53da55-04fb-46ea-a138-a50e7b354151 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.098815096 +0000 UTC m=+48.602374137 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls") pod "dns-default-nrx42" (UID: "0f53da55-04fb-46ea-a138-a50e7b354151") : secret "dns-default-metrics-tls" not found
Apr 20 12:14:51.099003 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.098843 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls podName:b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.098836915 +0000 UTC m=+48.602395956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls") pod "image-registry-5878cc68d6-ltsqp" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e") : secret "image-registry-tls" not found
Apr 20 12:14:51.147760 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:51.147727 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bfd6p_0a26209c-50bb-48d5-947b-95041bb256df/graceful-termination/0.log"
Apr 20 12:14:51.199250 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:51.199186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:51.199347 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.199316 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 12:14:51.199391 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:14:51.199369 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert podName:5ffce074-ab2a-4172-a6a3-7e85c82f6eb8 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.199352974 +0000 UTC m=+48.702912016 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert") pod "ingress-canary-q7wvt" (UID: "5ffce074-ab2a-4172-a6a3-7e85c82f6eb8") : secret "canary-serving-cert" not found
Apr 20 12:14:52.330383 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:52.330349 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c2d98" event={"ID":"0761a9c6-50b8-4f03-b0c7-65ce207c869d","Type":"ContainerStarted","Data":"cc285678f8747f80b34fb2bc3ae3d500ece0c2b56b540abc26754be6e7016111"}
Apr 20 12:14:52.346954 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:52.346907 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-c2d98" podStartSLOduration=2.064968667 podStartE2EDuration="4.346894129s" podCreationTimestamp="2026-04-20 12:14:48 +0000 UTC" firstStartedPulling="2026-04-20 12:14:48.946603139 +0000 UTC m=+38.450162183" lastFinishedPulling="2026-04-20 12:14:51.228528604 +0000 UTC m=+40.732087645" observedRunningTime="2026-04-20 12:14:52.346075205 +0000 UTC m=+41.849634269" watchObservedRunningTime="2026-04-20 12:14:52.346894129 +0000 UTC m=+41.850453191"
Apr 20 12:14:59.062235 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.062203 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:59.064610 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.064590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32b53c98-449d-4d6d-9ec4-06d42d60860e-original-pull-secret\") pod \"global-pull-secret-syncer-2zqds\" (UID: \"32b53c98-449d-4d6d-9ec4-06d42d60860e\") " pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:59.162607 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.162584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:59.162736 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.162656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:59.164825 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.164796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f53da55-04fb-46ea-a138-a50e7b354151-metrics-tls\") pod \"dns-default-nrx42\" (UID: \"0f53da55-04fb-46ea-a138-a50e7b354151\") " pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:59.164926 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.164892 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"image-registry-5878cc68d6-ltsqp\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:59.251944 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.251914 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:14:59.257620 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.257595 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zqds"
Apr 20 12:14:59.261214 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.261192 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrx42"
Apr 20 12:14:59.263002 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.262969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:59.267313 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.267283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ffce074-ab2a-4172-a6a3-7e85c82f6eb8-cert\") pod \"ingress-canary-q7wvt\" (UID: \"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8\") " pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:59.284166 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.284130 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7wvt"
Apr 20 12:14:59.398914 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.398884 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"]
Apr 20 12:14:59.402259 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:59.402228 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a9cea2_2ef1_4fb3_9bf6_1758705cc19e.slice/crio-4ef9215b57420b97382b6c4afee6acef82d72b751fef45693f4515da996cf524 WatchSource:0}: Error finding container 4ef9215b57420b97382b6c4afee6acef82d72b751fef45693f4515da996cf524: Status 404 returned error can't find the container with id 4ef9215b57420b97382b6c4afee6acef82d72b751fef45693f4515da996cf524
Apr 20 12:14:59.437079 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.437056 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7wvt"]
Apr 20 12:14:59.471545 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:59.471526 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ffce074_ab2a_4172_a6a3_7e85c82f6eb8.slice/crio-bbfdd90a42784c971ed171a65fb702f3d40b1b1f5c83550908b42a3db3c410dd WatchSource:0}: Error finding container bbfdd90a42784c971ed171a65fb702f3d40b1b1f5c83550908b42a3db3c410dd: Status 404 returned error can't find the container with id bbfdd90a42784c971ed171a65fb702f3d40b1b1f5c83550908b42a3db3c410dd
Apr 20 12:14:59.616694 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.616634 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2zqds"]
Apr 20 12:14:59.619128 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:59.619098 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b53c98_449d_4d6d_9ec4_06d42d60860e.slice/crio-8f4c6bea80150b708c2715bc2e262f257eceef830e33e002f41b28e117bc4aa6 WatchSource:0}: Error finding container 8f4c6bea80150b708c2715bc2e262f257eceef830e33e002f41b28e117bc4aa6: Status 404 returned error can't find the container with id 8f4c6bea80150b708c2715bc2e262f257eceef830e33e002f41b28e117bc4aa6
Apr 20 12:14:59.621146 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:14:59.621125 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrx42"]
Apr 20 12:14:59.623545 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:14:59.623523 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f53da55_04fb_46ea_a138_a50e7b354151.slice/crio-12abd5b2d9c842d0b29b6652d6468f443259eb8755b58ff96578af10b438fdaa WatchSource:0}: Error finding container 12abd5b2d9c842d0b29b6652d6468f443259eb8755b58ff96578af10b438fdaa: Status 404 returned error can't find the container with id 12abd5b2d9c842d0b29b6652d6468f443259eb8755b58ff96578af10b438fdaa
Apr 20 12:15:00.350045 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.349676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" event={"ID":"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e","Type":"ContainerStarted","Data":"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc"}
Apr 20 12:15:00.350045 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.349726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" event={"ID":"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e","Type":"ContainerStarted","Data":"4ef9215b57420b97382b6c4afee6acef82d72b751fef45693f4515da996cf524"}
Apr 20 12:15:00.350533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.350106 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp"
Apr 20 12:15:00.351538 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.351450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrx42" event={"ID":"0f53da55-04fb-46ea-a138-a50e7b354151","Type":"ContainerStarted","Data":"12abd5b2d9c842d0b29b6652d6468f443259eb8755b58ff96578af10b438fdaa"}
Apr 20 12:15:00.355248 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.355222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7wvt" event={"ID":"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8","Type":"ContainerStarted","Data":"bbfdd90a42784c971ed171a65fb702f3d40b1b1f5c83550908b42a3db3c410dd"}
Apr 20 12:15:00.357365 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.357340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2zqds" event={"ID":"32b53c98-449d-4d6d-9ec4-06d42d60860e","Type":"ContainerStarted","Data":"8f4c6bea80150b708c2715bc2e262f257eceef830e33e002f41b28e117bc4aa6"}
Apr 20 12:15:00.372472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:00.372429 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" podStartSLOduration=27.372413921 podStartE2EDuration="27.372413921s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:00.370448912 +0000 UTC m=+49.874007978" watchObservedRunningTime="2026-04-20 12:15:00.372413921 +0000 UTC m=+49.875972983"
Apr 20 12:15:02.366953 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:02.366920 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7wvt" event={"ID":"5ffce074-ab2a-4172-a6a3-7e85c82f6eb8","Type":"ContainerStarted","Data":"cf8b05a887fde83e9e72d6efa5d67dc46ffe6ce28e3800cc30044aa013766f17"}
Apr 20 12:15:02.369167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:02.369108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrx42" event={"ID":"0f53da55-04fb-46ea-a138-a50e7b354151","Type":"ContainerStarted","Data":"f142a2f3783893c7c4f210691fc7fcc6c4101a2ad9d23ad61348a08a2d157852"}
Apr 20 12:15:02.383228 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:02.383179 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q7wvt" podStartSLOduration=16.818783446 podStartE2EDuration="19.383143806s" podCreationTimestamp="2026-04-20 12:14:43 +0000 UTC" firstStartedPulling="2026-04-20 12:14:59.474220328 +0000 UTC m=+48.977779369" lastFinishedPulling="2026-04-20 12:15:02.038580677 +0000 UTC m=+51.542139729" observedRunningTime="2026-04-20 12:15:02.382771452 +0000 UTC m=+51.886330794" watchObservedRunningTime="2026-04-20 12:15:02.383143806 +0000 UTC m=+51.886702870"
Apr 20 12:15:03.374047 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:03.373996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrx42" event={"ID":"0f53da55-04fb-46ea-a138-a50e7b354151","Type":"ContainerStarted","Data":"2de12b790e404f1b6fa6915a48d05902ff229f59b10eaf4be7b8f27d2196bef4"}
Apr 20 12:15:03.391892 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:03.391836 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nrx42" podStartSLOduration=17.977751390999998 podStartE2EDuration="20.391818031s" podCreationTimestamp="2026-04-20 12:14:43 +0000 UTC" firstStartedPulling="2026-04-20 12:14:59.625160565 +0000 UTC m=+49.128719607" lastFinishedPulling="2026-04-20 12:15:02.039227201 +0000 UTC m=+51.542786247" observedRunningTime="2026-04-20 12:15:03.390463424 +0000 UTC m=+52.894022486" watchObservedRunningTime="2026-04-20 12:15:03.391818031 +0000 UTC m=+52.895377095"
Apr 20 12:15:04.377381 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:04.377343 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2zqds" event={"ID":"32b53c98-449d-4d6d-9ec4-06d42d60860e","Type":"ContainerStarted","Data":"eb939b170d0094125b06fbc8d9d7a5dba5a1c4167867e2e88173c3baa9353abd"}
Apr 20 12:15:04.377747 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:04.377627 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nrx42"
Apr 20 12:15:04.393556 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:04.393515 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2zqds" podStartSLOduration=33.102661067 podStartE2EDuration="37.393502771s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:59.620965897 +0000 UTC m=+49.124524938" lastFinishedPulling="2026-04-20 12:15:03.911807593 +0000 UTC m=+53.415366642" observedRunningTime="2026-04-20 12:15:04.393294018 +0000 UTC m=+53.896853080" watchObservedRunningTime="2026-04-20 12:15:04.393502771 +0000 UTC m=+53.897061826"
Apr 20 12:15:08.297784 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:08.297759 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pw7dn"
Apr 20 12:15:10.908133 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.908105 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-86khj"]
Apr 20 12:15:10.912884 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.912867 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-86khj"
Apr 20 12:15:10.915731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.915706 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 12:15:10.915863 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.915842 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 12:15:10.916263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.916243 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ld9bg\""
Apr 20 12:15:10.916685 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.916666 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"]
Apr 20 12:15:10.920735 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.920717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"
Apr 20 12:15:10.924335 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.924318 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-g426n\""
Apr 20 12:15:10.924677 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.924661 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 12:15:10.927107 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.927090 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"]
Apr 20 12:15:10.927839 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.927821 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-86khj"]
Apr 20 12:15:10.934445 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.934429 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"]
Apr 20 12:15:10.951573 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.951555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w2pd\" (UniqueName: \"kubernetes.io/projected/3b4b233a-eebb-419f-81c4-b65496b65e9b-kube-api-access-5w2pd\") pod \"downloads-6bcc868b7-86khj\" (UID: \"3b4b233a-eebb-419f-81c4-b65496b65e9b\") " pod="openshift-console/downloads-6bcc868b7-86khj"
Apr 20 12:15:10.951654 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:10.951587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5f94d305-89ae-4c03-800a-7c2bce1b4fd1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jxh2t\" (UID: \"5f94d305-89ae-4c03-800a-7c2bce1b4fd1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"
Apr 20 12:15:11.021233 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.021209 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6cfd894b8b-drw6h"]
Apr 20 12:15:11.024479 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.024462 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.041209 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.041186 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cfd894b8b-drw6h"]
Apr 20 12:15:11.052161 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-image-registry-private-configuration\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052268 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-bound-sa-token\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052268 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052209 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-installation-pull-secrets\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052370 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aef389fc-fc83-4809-871c-658c7804d8e6-ca-trust-extracted\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-registry-tls\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-registry-certificates\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052506 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w2pd\" (UniqueName: \"kubernetes.io/projected/3b4b233a-eebb-419f-81c4-b65496b65e9b-kube-api-access-5w2pd\") pod \"downloads-6bcc868b7-86khj\" (UID: \"3b4b233a-eebb-419f-81c4-b65496b65e9b\") " pod="openshift-console/downloads-6bcc868b7-86khj"
Apr 20 12:15:11.052506 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbl2\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-kube-api-access-wkbl2\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.052599 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052531 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5f94d305-89ae-4c03-800a-7c2bce1b4fd1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jxh2t\" (UID: \"5f94d305-89ae-4c03-800a-7c2bce1b4fd1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"
Apr 20 12:15:11.052599 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.052557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-trusted-ca\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.055505 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.055489 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 12:15:11.065749 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.065729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5f94d305-89ae-4c03-800a-7c2bce1b4fd1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jxh2t\" (UID: \"5f94d305-89ae-4c03-800a-7c2bce1b4fd1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"
Apr 20 12:15:11.075200 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.075181 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 12:15:11.084155 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.084137 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 12:15:11.094607 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.094582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w2pd\" (UniqueName: \"kubernetes.io/projected/3b4b233a-eebb-419f-81c4-b65496b65e9b-kube-api-access-5w2pd\") pod \"downloads-6bcc868b7-86khj\" (UID: \"3b4b233a-eebb-419f-81c4-b65496b65e9b\") " pod="openshift-console/downloads-6bcc868b7-86khj"
Apr 20 12:15:11.127442 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.127417 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6kp8w"]
Apr 20 12:15:11.130696 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.130681 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.133697 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.133679 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 12:15:11.133916 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.133882 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 12:15:11.134061 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.134045 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cqzj8\""
Apr 20 12:15:11.134882 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.134866 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 12:15:11.134963 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.134892 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 12:15:11.148677 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.148658 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6kp8w"]
Apr 20 12:15:11.153115 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-registry-tls\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-registry-certificates\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.153273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbl2\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-kube-api-access-wkbl2\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-trusted-ca\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153273 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153209 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2666143-2c9f-4812-bb5e-895b1c4c4891-crio-socket\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.153481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153314 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tktmg\" (UniqueName: \"kubernetes.io/projected/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-api-access-tktmg\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.153481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-image-registry-private-configuration\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-bound-sa-token\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2666143-2c9f-4812-bb5e-895b1c4c4891-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.153481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2666143-2c9f-4812-bb5e-895b1c4c4891-data-volume\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w"
Apr 20 12:15:11.153649 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-installation-pull-secrets\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153649 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aef389fc-fc83-4809-871c-658c7804d8e6-ca-trust-extracted\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.153945 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.153915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aef389fc-fc83-4809-871c-658c7804d8e6-ca-trust-extracted\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.154302 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.154276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-registry-certificates\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.154407 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.154361 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aef389fc-fc83-4809-871c-658c7804d8e6-trusted-ca\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.156624 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.156582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-image-registry-private-configuration\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.156836 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.156810 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-registry-tls\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.156891 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.156858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aef389fc-fc83-4809-871c-658c7804d8e6-installation-pull-secrets\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h"
Apr 20 12:15:11.164757 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.164692 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbl2\" (UniqueName:
\"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-kube-api-access-wkbl2\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" Apr 20 12:15:11.165358 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.165316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aef389fc-fc83-4809-871c-658c7804d8e6-bound-sa-token\") pod \"image-registry-6cfd894b8b-drw6h\" (UID: \"aef389fc-fc83-4809-871c-658c7804d8e6\") " pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" Apr 20 12:15:11.224580 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.224558 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ld9bg\"" Apr 20 12:15:11.231913 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.231895 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-g426n\"" Apr 20 12:15:11.232938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.232919 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-86khj" Apr 20 12:15:11.240262 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.240239 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" Apr 20 12:15:11.254127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2666143-2c9f-4812-bb5e-895b1c4c4891-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2666143-2c9f-4812-bb5e-895b1c4c4891-data-volume\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254305 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254475 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2666143-2c9f-4812-bb5e-895b1c4c4891-crio-socket\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254589 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254505 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2666143-2c9f-4812-bb5e-895b1c4c4891-data-volume\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254589 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tktmg\" (UniqueName: \"kubernetes.io/projected/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-api-access-tktmg\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254589 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2666143-2c9f-4812-bb5e-895b1c4c4891-crio-socket\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.254808 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.254794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.256307 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.256289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2666143-2c9f-4812-bb5e-895b1c4c4891-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.263983 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.263961 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tktmg\" (UniqueName: \"kubernetes.io/projected/c2666143-2c9f-4812-bb5e-895b1c4c4891-kube-api-access-tktmg\") pod \"insights-runtime-extractor-6kp8w\" (UID: \"c2666143-2c9f-4812-bb5e-895b1c4c4891\") " pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.332424 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.332384 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" Apr 20 12:15:11.363899 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.363870 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-86khj"] Apr 20 12:15:11.367207 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:11.367174 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4b233a_eebb_419f_81c4_b65496b65e9b.slice/crio-827484553e3481582476fbb2f48a641a45834330dc3c4eac2e77e83cbd8eb0be WatchSource:0}: Error finding container 827484553e3481582476fbb2f48a641a45834330dc3c4eac2e77e83cbd8eb0be: Status 404 returned error can't find the container with id 827484553e3481582476fbb2f48a641a45834330dc3c4eac2e77e83cbd8eb0be Apr 20 12:15:11.378613 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.378583 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t"] Apr 20 12:15:11.385815 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:11.385768 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f94d305_89ae_4c03_800a_7c2bce1b4fd1.slice/crio-3193e00e0c99b83fc0373b53d4c7fe15ca522ae56c2b454c3927282efabdc44f WatchSource:0}: Error finding container 
3193e00e0c99b83fc0373b53d4c7fe15ca522ae56c2b454c3927282efabdc44f: Status 404 returned error can't find the container with id 3193e00e0c99b83fc0373b53d4c7fe15ca522ae56c2b454c3927282efabdc44f Apr 20 12:15:11.397891 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.397854 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" event={"ID":"5f94d305-89ae-4c03-800a-7c2bce1b4fd1","Type":"ContainerStarted","Data":"3193e00e0c99b83fc0373b53d4c7fe15ca522ae56c2b454c3927282efabdc44f"} Apr 20 12:15:11.399004 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.398963 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-86khj" event={"ID":"3b4b233a-eebb-419f-81c4-b65496b65e9b","Type":"ContainerStarted","Data":"827484553e3481582476fbb2f48a641a45834330dc3c4eac2e77e83cbd8eb0be"} Apr 20 12:15:11.440376 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.440315 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6kp8w" Apr 20 12:15:11.465286 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.465259 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cfd894b8b-drw6h"] Apr 20 12:15:11.468812 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:11.468786 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef389fc_fc83_4809_871c_658c7804d8e6.slice/crio-3f51524ee1c20354b899e02f055eec9ae248d56f9503cf6ccf9d59509752a9fb WatchSource:0}: Error finding container 3f51524ee1c20354b899e02f055eec9ae248d56f9503cf6ccf9d59509752a9fb: Status 404 returned error can't find the container with id 3f51524ee1c20354b899e02f055eec9ae248d56f9503cf6ccf9d59509752a9fb Apr 20 12:15:11.559978 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:11.559952 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6kp8w"] Apr 20 12:15:11.563306 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:11.563280 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2666143_2c9f_4812_bb5e_895b1c4c4891.slice/crio-2fe2577e7e729fadfb38296aee7b4eef041f89e392e2ce51365cdb907c4ec4a5 WatchSource:0}: Error finding container 2fe2577e7e729fadfb38296aee7b4eef041f89e392e2ce51365cdb907c4ec4a5: Status 404 returned error can't find the container with id 2fe2577e7e729fadfb38296aee7b4eef041f89e392e2ce51365cdb907c4ec4a5 Apr 20 12:15:12.414000 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:12.413927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6kp8w" event={"ID":"c2666143-2c9f-4812-bb5e-895b1c4c4891","Type":"ContainerStarted","Data":"6777475f1b4865b740646a3225c7fbe321d7c1e340b4f4535fe0dc3d84e33d01"} Apr 20 12:15:12.414000 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:15:12.413974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6kp8w" event={"ID":"c2666143-2c9f-4812-bb5e-895b1c4c4891","Type":"ContainerStarted","Data":"2fe2577e7e729fadfb38296aee7b4eef041f89e392e2ce51365cdb907c4ec4a5"} Apr 20 12:15:12.416069 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:12.416040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" event={"ID":"aef389fc-fc83-4809-871c-658c7804d8e6","Type":"ContainerStarted","Data":"c2fae76c80ab7fe5eb72d1dd273809817c6508cd76bfd063ac4aecdd1a05b9d3"} Apr 20 12:15:12.416196 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:12.416076 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" event={"ID":"aef389fc-fc83-4809-871c-658c7804d8e6","Type":"ContainerStarted","Data":"3f51524ee1c20354b899e02f055eec9ae248d56f9503cf6ccf9d59509752a9fb"} Apr 20 12:15:12.416428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:12.416410 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" Apr 20 12:15:12.436107 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:12.436061 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" podStartSLOduration=2.436049672 podStartE2EDuration="2.436049672s" podCreationTimestamp="2026-04-20 12:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:12.434650717 +0000 UTC m=+61.938209781" watchObservedRunningTime="2026-04-20 12:15:12.436049672 +0000 UTC m=+61.939608733" Apr 20 12:15:13.421513 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:13.421462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" event={"ID":"5f94d305-89ae-4c03-800a-7c2bce1b4fd1","Type":"ContainerStarted","Data":"724ee566e1be0fd29e0b02a343c2262fb54fc028fce6c6e2601da8356a9316de"} Apr 20 12:15:13.423329 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:13.423287 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6kp8w" event={"ID":"c2666143-2c9f-4812-bb5e-895b1c4c4891","Type":"ContainerStarted","Data":"49b692bfe936ce7fc3875ac2f435e074ee709c06c79e08313ea2a3ddee491861"} Apr 20 12:15:13.437208 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:13.437158 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" podStartSLOduration=2.006318326 podStartE2EDuration="3.437145334s" podCreationTimestamp="2026-04-20 12:15:10 +0000 UTC" firstStartedPulling="2026-04-20 12:15:11.388164663 +0000 UTC m=+60.891723704" lastFinishedPulling="2026-04-20 12:15:12.818991638 +0000 UTC m=+62.322550712" observedRunningTime="2026-04-20 12:15:13.436628426 +0000 UTC m=+62.940187492" watchObservedRunningTime="2026-04-20 12:15:13.437145334 +0000 UTC m=+62.940704433" Apr 20 12:15:14.382502 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.382469 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nrx42" Apr 20 12:15:14.426705 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.426648 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" Apr 20 12:15:14.431997 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.431971 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jxh2t" Apr 20 12:15:14.633596 ip-10-0-135-187 kubenswrapper[2570]: I0420 
12:15:14.633515 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:15:14.661383 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.661363 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:15:14.661515 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.661463 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.663923 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.663900 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 12:15:14.663923 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.663924 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 12:15:14.664260 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.664237 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 12:15:14.664369 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.664273 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 12:15:14.664369 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.664335 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 12:15:14.664578 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.664553 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfmjp\"" Apr 20 12:15:14.683428 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.683399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.683531 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.683440 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.683531 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.683472 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc2fc\" (UniqueName: \"kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.683614 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.683564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.683614 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.683584 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.683614 ip-10-0-135-187 kubenswrapper[2570]: I0420 
12:15:14.683601 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.783928 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.783895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784090 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.783938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784090 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.783988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc2fc\" (UniqueName: \"kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784090 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " 
pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784090 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784090 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784826 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784826 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.784991 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.784864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config\") pod \"console-65c7948d4-8dwqb\" (UID: 
\"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.786518 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.786497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.787013 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.786991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.792356 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.792332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc2fc\" (UniqueName: \"kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc\") pod \"console-65c7948d4-8dwqb\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:14.972373 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:14.972295 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:15.112389 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.112356 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:15:15.115797 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:15.115771 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e627f2c_6478_4328_8c1e_b69ef7ae9b92.slice/crio-07def7002d7f89f9090dd3f81ea9a86a7dd7460791f5a6ac7050ca6546ad3098 WatchSource:0}: Error finding container 07def7002d7f89f9090dd3f81ea9a86a7dd7460791f5a6ac7050ca6546ad3098: Status 404 returned error can't find the container with id 07def7002d7f89f9090dd3f81ea9a86a7dd7460791f5a6ac7050ca6546ad3098 Apr 20 12:15:15.434549 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.434511 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6kp8w" event={"ID":"c2666143-2c9f-4812-bb5e-895b1c4c4891","Type":"ContainerStarted","Data":"01ac23c7f25975d92a53e831641f840ca551fe36bac4ea85f514194071405e10"} Apr 20 12:15:15.435658 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.435629 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c7948d4-8dwqb" event={"ID":"7e627f2c-6478-4328-8c1e-b69ef7ae9b92","Type":"ContainerStarted","Data":"07def7002d7f89f9090dd3f81ea9a86a7dd7460791f5a6ac7050ca6546ad3098"} Apr 20 12:15:15.452416 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.452361 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6kp8w" podStartSLOduration=1.619745529 podStartE2EDuration="4.452345778s" podCreationTimestamp="2026-04-20 12:15:11 +0000 UTC" firstStartedPulling="2026-04-20 12:15:11.612943212 +0000 UTC m=+61.116502256" lastFinishedPulling="2026-04-20 12:15:14.445543464 +0000 UTC m=+63.949102505" 
observedRunningTime="2026-04-20 12:15:15.451384303 +0000 UTC m=+64.954943372" watchObservedRunningTime="2026-04-20 12:15:15.452345778 +0000 UTC m=+64.955904842" Apr 20 12:15:15.534173 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.534133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vmbsf"] Apr 20 12:15:15.572926 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.572898 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vmbsf"] Apr 20 12:15:15.573147 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.573078 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.575799 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.575775 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 12:15:15.576173 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.576151 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 12:15:15.576294 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.576192 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 12:15:15.576294 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.576192 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 12:15:15.576738 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.576719 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7fpdz\"" Apr 20 12:15:15.576809 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.576750 2570 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 12:15:15.691081 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.690945 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.691234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.691101 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.691234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.691141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13f073-4cfd-494b-8515-5ab165e362bf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.691234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.691185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdwk\" (UniqueName: \"kubernetes.io/projected/7a13f073-4cfd-494b-8515-5ab165e362bf-kube-api-access-trdwk\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.792060 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.792029 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.792060 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.792064 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13f073-4cfd-494b-8515-5ab165e362bf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.792290 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.792090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trdwk\" (UniqueName: \"kubernetes.io/projected/7a13f073-4cfd-494b-8515-5ab165e362bf-kube-api-access-trdwk\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.792290 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.792135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.792823 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.792797 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13f073-4cfd-494b-8515-5ab165e362bf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.795139 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.795113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.795329 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.795274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a13f073-4cfd-494b-8515-5ab165e362bf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.804617 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.804591 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdwk\" (UniqueName: \"kubernetes.io/projected/7a13f073-4cfd-494b-8515-5ab165e362bf-kube-api-access-trdwk\") pod \"prometheus-operator-5676c8c784-vmbsf\" (UID: \"7a13f073-4cfd-494b-8515-5ab165e362bf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:15.884816 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:15.884779 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" Apr 20 12:15:16.052341 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.052288 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vmbsf"] Apr 20 12:15:16.056010 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:16.055974 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a13f073_4cfd_494b_8515_5ab165e362bf.slice/crio-5c4b9e67856704290cc6fecc1d6658331b817f35833a302e0eb1b59667b384b0 WatchSource:0}: Error finding container 5c4b9e67856704290cc6fecc1d6658331b817f35833a302e0eb1b59667b384b0: Status 404 returned error can't find the container with id 5c4b9e67856704290cc6fecc1d6658331b817f35833a302e0eb1b59667b384b0 Apr 20 12:15:16.440481 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.440444 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" event={"ID":"7a13f073-4cfd-494b-8515-5ab165e362bf","Type":"ContainerStarted","Data":"5c4b9e67856704290cc6fecc1d6658331b817f35833a302e0eb1b59667b384b0"} Apr 20 12:15:16.803246 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.802762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:15:16.803246 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.802829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " 
pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:15:16.805792 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.805631 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 12:15:16.806587 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.806565 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 12:15:16.816072 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.816050 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 12:15:16.820287 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.820264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa17428-5484-40d3-9fb5-b11e5a64f1be-metrics-certs\") pod \"network-metrics-daemon-8xfxk\" (UID: \"faa17428-5484-40d3-9fb5-b11e5a64f1be\") " pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:15:16.827969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.827925 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmgk\" (UniqueName: \"kubernetes.io/projected/6b4ca455-a400-4bf3-8bd0-0b93d1456970-kube-api-access-ldmgk\") pod \"network-check-target-hf6xh\" (UID: \"6b4ca455-a400-4bf3-8bd0-0b93d1456970\") " pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:15:16.947771 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.947737 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\"" Apr 20 12:15:16.955833 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.955761 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xfxk" Apr 20 12:15:16.956294 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.956266 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\"" Apr 20 12:15:16.964835 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:16.964814 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:15:18.684505 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:18.684459 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hf6xh"] Apr 20 12:15:18.699157 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:18.699125 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4ca455_a400_4bf3_8bd0_0b93d1456970.slice/crio-c9036176c31e672a85d48013aea254614d9f2eca09d546033232404a625d4608 WatchSource:0}: Error finding container c9036176c31e672a85d48013aea254614d9f2eca09d546033232404a625d4608: Status 404 returned error can't find the container with id c9036176c31e672a85d48013aea254614d9f2eca09d546033232404a625d4608 Apr 20 12:15:18.702976 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:18.702950 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8xfxk"] Apr 20 12:15:18.707074 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:18.707044 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa17428_5484_40d3_9fb5_b11e5a64f1be.slice/crio-015d266ad84c29044e11c89e18e96c2bccf8ab3f668e59af64961207b7008411 WatchSource:0}: Error finding container 015d266ad84c29044e11c89e18e96c2bccf8ab3f668e59af64961207b7008411: Status 404 returned error can't find the container with id 
015d266ad84c29044e11c89e18e96c2bccf8ab3f668e59af64961207b7008411 Apr 20 12:15:19.453511 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.453471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c7948d4-8dwqb" event={"ID":"7e627f2c-6478-4328-8c1e-b69ef7ae9b92","Type":"ContainerStarted","Data":"dcb38064a7c3d92d1a01caaf0d24126273a60e922b60b533b53a016958a78fb0"} Apr 20 12:15:19.455367 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.455329 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xfxk" event={"ID":"faa17428-5484-40d3-9fb5-b11e5a64f1be","Type":"ContainerStarted","Data":"015d266ad84c29044e11c89e18e96c2bccf8ab3f668e59af64961207b7008411"} Apr 20 12:15:19.458081 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.458011 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" event={"ID":"7a13f073-4cfd-494b-8515-5ab165e362bf","Type":"ContainerStarted","Data":"fc13d2d03459a85f3bf090da2c5a593692502e183d5e1f61f436cdee663c063d"} Apr 20 12:15:19.458081 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.458059 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" event={"ID":"7a13f073-4cfd-494b-8515-5ab165e362bf","Type":"ContainerStarted","Data":"f62d42920f50a9c09c6f8210b9271651d0739da3e8d6aaa6f930aaa9103dca8d"} Apr 20 12:15:19.459509 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.459488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hf6xh" event={"ID":"6b4ca455-a400-4bf3-8bd0-0b93d1456970","Type":"ContainerStarted","Data":"c9036176c31e672a85d48013aea254614d9f2eca09d546033232404a625d4608"} Apr 20 12:15:19.475177 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.474685 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-65c7948d4-8dwqb" podStartSLOduration=2.05332619 podStartE2EDuration="5.474667782s" podCreationTimestamp="2026-04-20 12:15:14 +0000 UTC" firstStartedPulling="2026-04-20 12:15:15.11804362 +0000 UTC m=+64.621602668" lastFinishedPulling="2026-04-20 12:15:18.539385203 +0000 UTC m=+68.042944260" observedRunningTime="2026-04-20 12:15:19.473059652 +0000 UTC m=+68.976618715" watchObservedRunningTime="2026-04-20 12:15:19.474667782 +0000 UTC m=+68.978226846" Apr 20 12:15:19.488528 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:19.488483 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vmbsf" podStartSLOduration=2.008207395 podStartE2EDuration="4.488468262s" podCreationTimestamp="2026-04-20 12:15:15 +0000 UTC" firstStartedPulling="2026-04-20 12:15:16.058368549 +0000 UTC m=+65.561927591" lastFinishedPulling="2026-04-20 12:15:18.538629414 +0000 UTC m=+68.042188458" observedRunningTime="2026-04-20 12:15:19.488382537 +0000 UTC m=+68.991941602" watchObservedRunningTime="2026-04-20 12:15:19.488468262 +0000 UTC m=+68.992027326" Apr 20 12:15:20.465185 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.465085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xfxk" event={"ID":"faa17428-5484-40d3-9fb5-b11e5a64f1be","Type":"ContainerStarted","Data":"e7a35058a1d3b1c7040ee4e176c8a439b05cc7851352b7f7c627db564494e25f"} Apr 20 12:15:20.465185 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.465142 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xfxk" event={"ID":"faa17428-5484-40d3-9fb5-b11e5a64f1be","Type":"ContainerStarted","Data":"5b41cd14bf294d7ff5e84377d0543e8efd54b25ac77a0cf595f4e0140cb56aeb"} Apr 20 12:15:20.481982 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.481932 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-8xfxk" podStartSLOduration=68.27249581 podStartE2EDuration="1m9.481914147s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:15:18.708799569 +0000 UTC m=+68.212358624" lastFinishedPulling="2026-04-20 12:15:19.918217919 +0000 UTC m=+69.421776961" observedRunningTime="2026-04-20 12:15:20.479570021 +0000 UTC m=+69.983129084" watchObservedRunningTime="2026-04-20 12:15:20.481914147 +0000 UTC m=+69.985473212" Apr 20 12:15:20.934248 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.934216 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4jct8"] Apr 20 12:15:20.938613 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.938578 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" Apr 20 12:15:20.938759 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.938716 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:20.940813 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.940794 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 12:15:20.941778 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.941749 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 12:15:20.941882 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.941804 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 12:15:20.941945 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:20.941749 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jsw86\"" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037535 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-sys\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-tls\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-wtmp\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-root\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg474\" (UniqueName: \"kubernetes.io/projected/8322defd-f262-40d7-95f0-1e747662e436-kube-api-access-sg474\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 
20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037706 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-metrics-client-ca\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.037866 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.037742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-textfile\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.138728 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-textfile\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.138893 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.138893 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.138893 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-sys\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.138893 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-sys\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-tls\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-wtmp\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-root\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.138998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg474\" (UniqueName: \"kubernetes.io/projected/8322defd-f262-40d7-95f0-1e747662e436-kube-api-access-sg474\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-metrics-client-ca\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139366 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-textfile\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139366 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-root\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139366 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-wtmp\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139531 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139448 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.139582 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.139525 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8322defd-f262-40d7-95f0-1e747662e436-metrics-client-ca\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.141627 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.141607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-tls\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.142127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.142098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8322defd-f262-40d7-95f0-1e747662e436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.147824 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.147802 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg474\" (UniqueName: \"kubernetes.io/projected/8322defd-f262-40d7-95f0-1e747662e436-kube-api-access-sg474\") pod \"node-exporter-4jct8\" (UID: \"8322defd-f262-40d7-95f0-1e747662e436\") " pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:21.251111 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:21.251042 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4jct8" Apr 20 12:15:24.973386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:24.973345 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:24.973841 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:24.973399 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:24.978942 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:24.978916 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:25.485676 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:25.485643 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:15:29.216345 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:15:29.216292 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8322defd_f262_40d7_95f0_1e747662e436.slice/crio-4c9d49ed1802243365578c3f84b4ab5e698b0f098eeb130e5cfa7beea34dbed0 WatchSource:0}: Error finding container 4c9d49ed1802243365578c3f84b4ab5e698b0f098eeb130e5cfa7beea34dbed0: Status 404 returned error can't find the container with id 4c9d49ed1802243365578c3f84b4ab5e698b0f098eeb130e5cfa7beea34dbed0 Apr 20 12:15:29.496524 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.496431 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hf6xh" event={"ID":"6b4ca455-a400-4bf3-8bd0-0b93d1456970","Type":"ContainerStarted","Data":"fb37f1d1aee68af833109b84f126770be3c27f5bd4e91a3cf7e998edc841d456"} Apr 20 12:15:29.496669 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.496528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:15:29.497948 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.497901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-86khj" event={"ID":"3b4b233a-eebb-419f-81c4-b65496b65e9b","Type":"ContainerStarted","Data":"24e33861b23facedb55736d2d70e4f6a1f876221652ba9af98f3b541fa646d7e"} Apr 20 12:15:29.499559 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.498270 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-86khj" Apr 20 12:15:29.499694 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.499639 2570 patch_prober.go:28] interesting pod/downloads-6bcc868b7-86khj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.11:8080/\": dial tcp 10.133.0.11:8080: connect: connection refused" start-of-body= Apr 20 12:15:29.499756 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.499683 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-86khj" podUID="3b4b233a-eebb-419f-81c4-b65496b65e9b" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.11:8080/\": dial tcp 10.133.0.11:8080: connect: connection refused" Apr 20 12:15:29.500386 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.500361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jct8" 
event={"ID":"8322defd-f262-40d7-95f0-1e747662e436","Type":"ContainerStarted","Data":"4c9d49ed1802243365578c3f84b4ab5e698b0f098eeb130e5cfa7beea34dbed0"} Apr 20 12:15:29.514592 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.514549 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hf6xh" podStartSLOduration=67.995042264 podStartE2EDuration="1m18.514537038s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:15:18.704408992 +0000 UTC m=+68.207968034" lastFinishedPulling="2026-04-20 12:15:29.223903761 +0000 UTC m=+78.727462808" observedRunningTime="2026-04-20 12:15:29.513762039 +0000 UTC m=+79.017321126" watchObservedRunningTime="2026-04-20 12:15:29.514537038 +0000 UTC m=+79.018096080" Apr 20 12:15:29.528805 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:29.528763 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-86khj" podStartSLOduration=1.671324772 podStartE2EDuration="19.528726262s" podCreationTimestamp="2026-04-20 12:15:10 +0000 UTC" firstStartedPulling="2026-04-20 12:15:11.369068826 +0000 UTC m=+60.872627870" lastFinishedPulling="2026-04-20 12:15:29.226470319 +0000 UTC m=+78.730029360" observedRunningTime="2026-04-20 12:15:29.528580269 +0000 UTC m=+79.032139338" watchObservedRunningTime="2026-04-20 12:15:29.528726262 +0000 UTC m=+79.032285339" Apr 20 12:15:30.516310 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:30.516282 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-86khj" Apr 20 12:15:31.512896 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:31.512858 2570 generic.go:358] "Generic (PLEG): container finished" podID="8322defd-f262-40d7-95f0-1e747662e436" containerID="e39f990db64cb628a6e651ad8768a3f49259ad3713ef4e466356efcd0a116c79" exitCode=0 Apr 20 12:15:31.513598 ip-10-0-135-187 kubenswrapper[2570]: 
I0420 12:15:31.513573 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jct8" event={"ID":"8322defd-f262-40d7-95f0-1e747662e436","Type":"ContainerDied","Data":"e39f990db64cb628a6e651ad8768a3f49259ad3713ef4e466356efcd0a116c79"} Apr 20 12:15:32.518884 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:32.518839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jct8" event={"ID":"8322defd-f262-40d7-95f0-1e747662e436","Type":"ContainerStarted","Data":"f6dc9323d0fb934f14c10f8dcabf327cab021698497a40d6a1bdf686f95ace5c"} Apr 20 12:15:32.519325 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:32.518892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jct8" event={"ID":"8322defd-f262-40d7-95f0-1e747662e436","Type":"ContainerStarted","Data":"a17aa188451ce7845b3898ed653f667f07af38a01920fe7117dc7c00abce6cee"} Apr 20 12:15:32.537146 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:32.537088 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4jct8" podStartSLOduration=11.289500552 podStartE2EDuration="12.537070331s" podCreationTimestamp="2026-04-20 12:15:20 +0000 UTC" firstStartedPulling="2026-04-20 12:15:29.218297535 +0000 UTC m=+78.721856582" lastFinishedPulling="2026-04-20 12:15:30.465867314 +0000 UTC m=+79.969426361" observedRunningTime="2026-04-20 12:15:32.536101112 +0000 UTC m=+82.039660176" watchObservedRunningTime="2026-04-20 12:15:32.537070331 +0000 UTC m=+82.040629385" Apr 20 12:15:33.427773 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:33.427737 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6cfd894b8b-drw6h" Apr 20 12:15:35.946660 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:35.946589 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" podUID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" containerName="registry" containerID="cri-o://de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc" gracePeriod=30 Apr 20 12:15:36.211108 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.211079 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" Apr 20 12:15:36.372567 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372531 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.372743 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372578 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.372743 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372615 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.372743 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372645 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: 
\"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.372743 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372684 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.372743 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372717 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.373005 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372757 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj9lb\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.373005 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372782 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca\") pod \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\" (UID: \"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e\") " Apr 20 12:15:36.373005 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.372936 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:36.373503 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.373398 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:36.375387 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.375273 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:15:36.375502 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.375354 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:15:36.375502 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.375475 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:15:36.375502 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.375485 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:15:36.375834 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.375792 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb" (OuterVolumeSpecName: "kube-api-access-nj9lb") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "kube-api-access-nj9lb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:15:36.383967 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.383945 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" (UID: "b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:15:36.474217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474162 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-ca-trust-extracted\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474187 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-installation-pull-secrets\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474197 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nj9lb\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-kube-api-access-nj9lb\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474207 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-trusted-ca\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474217 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474216 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-certificates\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474225 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-registry-tls\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474478 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474236 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-image-registry-private-configuration\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.474478 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.474245 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e-bound-sa-token\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:15:36.534426 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.534394 2570 generic.go:358] "Generic (PLEG): container finished" podID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" containerID="de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc" exitCode=0 Apr 20 12:15:36.534543 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.534438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" event={"ID":"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e","Type":"ContainerDied","Data":"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc"} Apr 20 12:15:36.534543 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.534466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" event={"ID":"b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e","Type":"ContainerDied","Data":"4ef9215b57420b97382b6c4afee6acef82d72b751fef45693f4515da996cf524"} Apr 20 12:15:36.534543 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.534487 2570 scope.go:117] "RemoveContainer" containerID="de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc" Apr 20 12:15:36.534543 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.534489 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5878cc68d6-ltsqp" Apr 20 12:15:36.546842 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.546822 2570 scope.go:117] "RemoveContainer" containerID="de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc" Apr 20 12:15:36.547145 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:15:36.547117 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc\": container with ID starting with de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc not found: ID does not exist" containerID="de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc" Apr 20 12:15:36.547258 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.547155 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc"} err="failed to get container status \"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc\": rpc error: code = NotFound desc = could not find container \"de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc\": container with ID starting with de340ea63161c512dced4faa936d080cd068f014cd06eeee5e9dc5e13750cbbc not found: ID does not exist" Apr 20 12:15:36.557149 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.557128 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"] Apr 20 12:15:36.561756 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:36.561734 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5878cc68d6-ltsqp"] Apr 20 12:15:37.139084 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:37.139041 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" 
path="/var/lib/kubelet/pods/b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e/volumes" Apr 20 12:15:41.365566 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:15:41.365534 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:16:00.505092 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:00.505061 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hf6xh" Apr 20 12:16:06.384706 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.384668 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65c7948d4-8dwqb" podUID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" containerName="console" containerID="cri-o://dcb38064a7c3d92d1a01caaf0d24126273a60e922b60b533b53a016958a78fb0" gracePeriod=15 Apr 20 12:16:06.617205 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.617159 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65c7948d4-8dwqb_7e627f2c-6478-4328-8c1e-b69ef7ae9b92/console/0.log" Apr 20 12:16:06.617205 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.617193 2570 generic.go:358] "Generic (PLEG): container finished" podID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" containerID="dcb38064a7c3d92d1a01caaf0d24126273a60e922b60b533b53a016958a78fb0" exitCode=2 Apr 20 12:16:06.617385 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.617267 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c7948d4-8dwqb" event={"ID":"7e627f2c-6478-4328-8c1e-b69ef7ae9b92","Type":"ContainerDied","Data":"dcb38064a7c3d92d1a01caaf0d24126273a60e922b60b533b53a016958a78fb0"} Apr 20 12:16:06.663013 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.662994 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65c7948d4-8dwqb_7e627f2c-6478-4328-8c1e-b69ef7ae9b92/console/0.log" Apr 20 12:16:06.663121 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:16:06.663063 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:16:06.792804 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.792766 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert\") pod \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.792960 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.792812 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc2fc\" (UniqueName: \"kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc\") pod \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.792960 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.792830 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca\") pod \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.792960 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.792874 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert\") pod \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.792960 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.792898 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config\") pod 
\"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.793193 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.793088 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config\") pod \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\" (UID: \"7e627f2c-6478-4328-8c1e-b69ef7ae9b92\") " Apr 20 12:16:06.793320 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.793295 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config" (OuterVolumeSpecName: "console-config") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:06.793320 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.793307 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca" (OuterVolumeSpecName: "service-ca") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:06.793430 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.793350 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:06.794935 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.794912 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:06.795031 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.794998 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc" (OuterVolumeSpecName: "kube-api-access-mc2fc") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "kube-api-access-mc2fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:16:06.795080 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.795042 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7e627f2c-6478-4328-8c1e-b69ef7ae9b92" (UID: "7e627f2c-6478-4328-8c1e-b69ef7ae9b92"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:06.894557 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894533 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mc2fc\" (UniqueName: \"kubernetes.io/projected/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-kube-api-access-mc2fc\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:06.894557 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894555 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-service-ca\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:06.894672 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894564 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:06.894672 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894574 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:06.894672 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894582 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-console-oauth-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:06.894672 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:06.894591 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e627f2c-6478-4328-8c1e-b69ef7ae9b92-oauth-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:16:07.624543 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:16:07.624514 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65c7948d4-8dwqb_7e627f2c-6478-4328-8c1e-b69ef7ae9b92/console/0.log" Apr 20 12:16:07.624938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:07.624588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c7948d4-8dwqb" event={"ID":"7e627f2c-6478-4328-8c1e-b69ef7ae9b92","Type":"ContainerDied","Data":"07def7002d7f89f9090dd3f81ea9a86a7dd7460791f5a6ac7050ca6546ad3098"} Apr 20 12:16:07.624938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:07.624619 2570 scope.go:117] "RemoveContainer" containerID="dcb38064a7c3d92d1a01caaf0d24126273a60e922b60b533b53a016958a78fb0" Apr 20 12:16:07.624938 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:07.624639 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65c7948d4-8dwqb" Apr 20 12:16:07.640557 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:07.640531 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:16:07.644484 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:07.644464 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65c7948d4-8dwqb"] Apr 20 12:16:09.138187 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:09.138155 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" path="/var/lib/kubelet/pods/7e627f2c-6478-4328-8c1e-b69ef7ae9b92/volumes" Apr 20 12:16:49.226387 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226352 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"] Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226617 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" containerName="registry" 
Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226629 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" containerName="registry" Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226642 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" containerName="console" Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226647 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" containerName="console" Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226692 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e627f2c-6478-4328-8c1e-b69ef7ae9b92" containerName="console" Apr 20 12:16:49.226969 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.226701 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a9cea2-2ef1-4fb3-9bf6-1758705cc19e" containerName="registry" Apr 20 12:16:49.229638 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.229621 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.232148 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.232123 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 12:16:49.232309 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.232149 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 12:16:49.232420 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.232213 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 12:16:49.233359 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.233340 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 12:16:49.233449 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.233439 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfmjp\"" Apr 20 12:16:49.233700 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.233682 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 12:16:49.239996 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.239971 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"] Apr 20 12:16:49.240578 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.240554 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 12:16:49.260915 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.260886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca\") pod 
\"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.260979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261116 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.261063 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261230 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.261160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpvkj\" (UniqueName: \"kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261230 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.261220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261319 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.261251 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.261319 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.261279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.361830 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpvkj\" (UniqueName: \"kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " 
pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.361955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362011 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.362008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.362779 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.362747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config\") pod \"console-d8d46fdcc-qg25j\" (UID: 
\"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.363117 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.363093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.363211 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.363093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.363349 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.363331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.365340 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.365319 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.365448 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.365427 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config\") pod 
\"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.370686 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.370666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpvkj\" (UniqueName: \"kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj\") pod \"console-d8d46fdcc-qg25j\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.543994 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.543958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:49.656623 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.656595 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"] Apr 20 12:16:49.659830 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:16:49.659804 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9670129_978f_4a45_872f_414790b7e80f.slice/crio-934b22f606ad8ff84845516ad22f2e547281bc5e9e9db6b65656b5b02907704a WatchSource:0}: Error finding container 934b22f606ad8ff84845516ad22f2e547281bc5e9e9db6b65656b5b02907704a: Status 404 returned error can't find the container with id 934b22f606ad8ff84845516ad22f2e547281bc5e9e9db6b65656b5b02907704a Apr 20 12:16:49.742981 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.742948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8d46fdcc-qg25j" event={"ID":"b9670129-978f-4a45-872f-414790b7e80f","Type":"ContainerStarted","Data":"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"} Apr 20 12:16:49.743095 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.742986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-d8d46fdcc-qg25j" event={"ID":"b9670129-978f-4a45-872f-414790b7e80f","Type":"ContainerStarted","Data":"934b22f606ad8ff84845516ad22f2e547281bc5e9e9db6b65656b5b02907704a"} Apr 20 12:16:49.759636 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:49.759595 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d8d46fdcc-qg25j" podStartSLOduration=0.759580852 podStartE2EDuration="759.580852ms" podCreationTimestamp="2026-04-20 12:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:16:49.759296479 +0000 UTC m=+159.262855542" watchObservedRunningTime="2026-04-20 12:16:49.759580852 +0000 UTC m=+159.263139914" Apr 20 12:16:59.544669 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:59.544625 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:59.544669 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:59.544679 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:59.550146 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:59.550121 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:16:59.774860 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:16:59.774834 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:18:05.727294 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.727263 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:18:05.730248 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.730232 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.747658 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.747637 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:18:05.791800 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpdn\" (UniqueName: \"kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.791890 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.791890 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791863 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.791963 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 
12:18:05.791963 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.791963 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.791963 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.791957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.892988 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.892962 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnpdn\" (UniqueName: \"kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.892998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca\") pod 
\"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893167 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893417 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893748 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893748 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.893889 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.893784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.894127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.894104 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.895591 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.895560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.895676 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.895588 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:05.901027 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:05.900988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpdn\" (UniqueName: \"kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn\") pod \"console-785547cccd-4b6ht\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:06.038781 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:06.038754 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:06.158942 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:06.158788 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:18:06.161591 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:18:06.161555 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35d9f16_8267_4898_9d09_d9f89df0e470.slice/crio-bd00f8f6d48c75195bf5d2d814ee6412bbabd8b074bfa7c7d30912939921c040 WatchSource:0}: Error finding container bd00f8f6d48c75195bf5d2d814ee6412bbabd8b074bfa7c7d30912939921c040: Status 404 returned error can't find the container with id bd00f8f6d48c75195bf5d2d814ee6412bbabd8b074bfa7c7d30912939921c040 Apr 20 12:18:06.944343 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:06.944304 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785547cccd-4b6ht" event={"ID":"a35d9f16-8267-4898-9d09-d9f89df0e470","Type":"ContainerStarted","Data":"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e"} Apr 20 12:18:06.944696 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:06.944348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785547cccd-4b6ht" event={"ID":"a35d9f16-8267-4898-9d09-d9f89df0e470","Type":"ContainerStarted","Data":"bd00f8f6d48c75195bf5d2d814ee6412bbabd8b074bfa7c7d30912939921c040"} Apr 20 12:18:06.973037 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:06.972978 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-785547cccd-4b6ht" podStartSLOduration=1.9729635220000001 podStartE2EDuration="1.972963522s" podCreationTimestamp="2026-04-20 12:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:18:06.971423494 +0000 UTC 
m=+236.474982558" watchObservedRunningTime="2026-04-20 12:18:06.972963522 +0000 UTC m=+236.476522584" Apr 20 12:18:16.039830 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:16.039794 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:16.039830 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:16.039837 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:16.044533 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:16.044510 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:16.978572 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:16.978545 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:18:17.029628 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:17.029597 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"] Apr 20 12:18:42.047687 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.047597 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d8d46fdcc-qg25j" podUID="b9670129-978f-4a45-872f-414790b7e80f" containerName="console" containerID="cri-o://7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2" gracePeriod=15 Apr 20 12:18:42.290487 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.290467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8d46fdcc-qg25j_b9670129-978f-4a45-872f-414790b7e80f/console/0.log" Apr 20 12:18:42.290586 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.290526 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d8d46fdcc-qg25j" Apr 20 12:18:42.329575 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329511 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329575 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329568 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329590 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpvkj\" (UniqueName: \"kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329607 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329637 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329731 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329672 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.329731 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.329698 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config\") pod \"b9670129-978f-4a45-872f-414790b7e80f\" (UID: \"b9670129-978f-4a45-872f-414790b7e80f\") " Apr 20 12:18:42.330101 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.330069 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:18:42.330212 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.330072 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:18:42.330212 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.330119 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:18:42.330212 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.330196 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config" (OuterVolumeSpecName: "console-config") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:18:42.331772 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.331740 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:18:42.331871 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.331777 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj" (OuterVolumeSpecName: "kube-api-access-zpvkj") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "kube-api-access-zpvkj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:18:42.331871 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.331835 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9670129-978f-4a45-872f-414790b7e80f" (UID: "b9670129-978f-4a45-872f-414790b7e80f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:18:42.431054 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431000 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-trusted-ca-bundle\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:18:42.431054 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431050 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:18:42.431054 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431060 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-console-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:18:42.431054 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431068 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-service-ca\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:18:42.431263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431076 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b9670129-978f-4a45-872f-414790b7e80f-console-oauth-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:18:42.431263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431085 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpvkj\" (UniqueName: \"kubernetes.io/projected/b9670129-978f-4a45-872f-414790b7e80f-kube-api-access-zpvkj\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:18:42.431263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:42.431093 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9670129-978f-4a45-872f-414790b7e80f-oauth-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:18:43.041708 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8d46fdcc-qg25j_b9670129-978f-4a45-872f-414790b7e80f/console/0.log"
Apr 20 12:18:43.041898 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041724 2570 generic.go:358] "Generic (PLEG): container finished" podID="b9670129-978f-4a45-872f-414790b7e80f" containerID="7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2" exitCode=2
Apr 20 12:18:43.041898 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8d46fdcc-qg25j" event={"ID":"b9670129-978f-4a45-872f-414790b7e80f","Type":"ContainerDied","Data":"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"}
Apr 20 12:18:43.041898 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041783 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8d46fdcc-qg25j"
Apr 20 12:18:43.041898 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8d46fdcc-qg25j" event={"ID":"b9670129-978f-4a45-872f-414790b7e80f","Type":"ContainerDied","Data":"934b22f606ad8ff84845516ad22f2e547281bc5e9e9db6b65656b5b02907704a"}
Apr 20 12:18:43.041898 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.041811 2570 scope.go:117] "RemoveContainer" containerID="7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"
Apr 20 12:18:43.050013 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.049658 2570 scope.go:117] "RemoveContainer" containerID="7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"
Apr 20 12:18:43.050013 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:18:43.049932 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2\": container with ID starting with 7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2 not found: ID does not exist" containerID="7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"
Apr 20 12:18:43.050013 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.049963 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2"} err="failed to get container status \"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2\": rpc error: code = NotFound desc = could not find container \"7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2\": container with ID starting with 7cf0b96fba095706a8a3b2057a2f6c11828cd643a725535e4323632ab913c4c2 not found: ID does not exist"
Apr 20 12:18:43.062808 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.062782 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"]
Apr 20 12:18:43.064783 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.064764 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d8d46fdcc-qg25j"]
Apr 20 12:18:43.137186 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:18:43.137160 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9670129-978f-4a45-872f-414790b7e80f" path="/var/lib/kubelet/pods/b9670129-978f-4a45-872f-414790b7e80f/volumes"
Apr 20 12:19:11.005916 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:11.005889 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 12:19:23.344127 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.344082 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"]
Apr 20 12:19:23.346472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.344333 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9670129-978f-4a45-872f-414790b7e80f" containerName="console"
Apr 20 12:19:23.346472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.344342 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9670129-978f-4a45-872f-414790b7e80f" containerName="console"
Apr 20 12:19:23.346472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.344389 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9670129-978f-4a45-872f-414790b7e80f" containerName="console"
Apr 20 12:19:23.347302 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.347286 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.350088 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.349989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d2z52\""
Apr 20 12:19:23.350434 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.350197 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 12:19:23.350997 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.350974 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 12:19:23.355055 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.355009 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"]
Apr 20 12:19:23.410653 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.410624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.410653 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.410652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.410819 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.410720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgc8m\" (UniqueName: \"kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.511327 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.511296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgc8m\" (UniqueName: \"kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.511449 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.511342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.511449 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.511359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.511651 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.511635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.511690 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.511677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.519208 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.519183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgc8m\" (UniqueName: \"kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.659379 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.659298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:23.771312 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.771283 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"]
Apr 20 12:19:23.773920 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:19:23.773892 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf694de2e_45e0_4462_ae2b_48d7ef261c91.slice/crio-cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826 WatchSource:0}: Error finding container cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826: Status 404 returned error can't find the container with id cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826
Apr 20 12:19:23.775662 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:23.775645 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:19:24.145556 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:24.145519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g" event={"ID":"f694de2e-45e0-4462-ae2b-48d7ef261c91","Type":"ContainerStarted","Data":"cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826"}
Apr 20 12:19:30.165736 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:30.165702 2570 generic.go:358] "Generic (PLEG): container finished" podID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerID="7ba2da48e7a30c5b9cd5f0c9a78d6d203012359b95a4c8fa86489bb3648662be" exitCode=0
Apr 20 12:19:30.166128 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:30.165773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g" event={"ID":"f694de2e-45e0-4462-ae2b-48d7ef261c91","Type":"ContainerDied","Data":"7ba2da48e7a30c5b9cd5f0c9a78d6d203012359b95a4c8fa86489bb3648662be"}
Apr 20 12:19:33.174010 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:33.173974 2570 generic.go:358] "Generic (PLEG): container finished" podID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerID="e240ed2ad64ec9ae4f9763bd5636b3ec8a7d73017ac68ced70750c9916fc753c" exitCode=0
Apr 20 12:19:33.174492 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:33.174054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g" event={"ID":"f694de2e-45e0-4462-ae2b-48d7ef261c91","Type":"ContainerDied","Data":"e240ed2ad64ec9ae4f9763bd5636b3ec8a7d73017ac68ced70750c9916fc753c"}
Apr 20 12:19:40.198142 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:40.198102 2570 generic.go:358] "Generic (PLEG): container finished" podID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerID="22bd9c04e1e5eb3c470a01b9200f2e1a1b784c448bcb1ce2a3352e736735359f" exitCode=0
Apr 20 12:19:40.198510 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:40.198190 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g" event={"ID":"f694de2e-45e0-4462-ae2b-48d7ef261c91","Type":"ContainerDied","Data":"22bd9c04e1e5eb3c470a01b9200f2e1a1b784c448bcb1ce2a3352e736735359f"}
Apr 20 12:19:41.315251 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.315229 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:41.453082 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.452968 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle\") pod \"f694de2e-45e0-4462-ae2b-48d7ef261c91\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") "
Apr 20 12:19:41.453082 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.453008 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util\") pod \"f694de2e-45e0-4462-ae2b-48d7ef261c91\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") "
Apr 20 12:19:41.453082 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.453065 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgc8m\" (UniqueName: \"kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m\") pod \"f694de2e-45e0-4462-ae2b-48d7ef261c91\" (UID: \"f694de2e-45e0-4462-ae2b-48d7ef261c91\") "
Apr 20 12:19:41.453519 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.453491 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle" (OuterVolumeSpecName: "bundle") pod "f694de2e-45e0-4462-ae2b-48d7ef261c91" (UID: "f694de2e-45e0-4462-ae2b-48d7ef261c91"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 12:19:41.455178 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.455148 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m" (OuterVolumeSpecName: "kube-api-access-hgc8m") pod "f694de2e-45e0-4462-ae2b-48d7ef261c91" (UID: "f694de2e-45e0-4462-ae2b-48d7ef261c91"). InnerVolumeSpecName "kube-api-access-hgc8m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:19:41.457162 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.457140 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util" (OuterVolumeSpecName: "util") pod "f694de2e-45e0-4462-ae2b-48d7ef261c91" (UID: "f694de2e-45e0-4462-ae2b-48d7ef261c91"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 12:19:41.553534 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.553513 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-bundle\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:19:41.553534 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.553533 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f694de2e-45e0-4462-ae2b-48d7ef261c91-util\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:19:41.553672 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:41.553543 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgc8m\" (UniqueName: \"kubernetes.io/projected/f694de2e-45e0-4462-ae2b-48d7ef261c91-kube-api-access-hgc8m\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\""
Apr 20 12:19:42.205052 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:42.205008 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g"
Apr 20 12:19:42.205227 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:42.204997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hls6g" event={"ID":"f694de2e-45e0-4462-ae2b-48d7ef261c91","Type":"ContainerDied","Data":"cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826"}
Apr 20 12:19:42.205227 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:42.205140 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac752b8b6ce9f48a52e757fea9e62994664c6934189dd383cc9a0ba75ff4826"
Apr 20 12:19:45.390115 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390083 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"]
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390317 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="pull"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390328 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="pull"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390338 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="extract"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390343 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="extract"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390358 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="util"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390363 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="util"
Apr 20 12:19:45.390482 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.390402 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f694de2e-45e0-4462-ae2b-48d7ef261c91" containerName="extract"
Apr 20 12:19:45.426799 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.426763 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"]
Apr 20 12:19:45.426950 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.426881 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.429491 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.429473 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-9t64q\""
Apr 20 12:19:45.429683 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.429663 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 12:19:45.429766 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.429698 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:19:45.577445 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.577415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e739f1b8-2307-4833-9736-201241c92ef3-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.577588 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.577450 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxjf\" (UniqueName: \"kubernetes.io/projected/e739f1b8-2307-4833-9736-201241c92ef3-kube-api-access-lwxjf\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.678084 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.677996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e739f1b8-2307-4833-9736-201241c92ef3-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.678084 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.678043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxjf\" (UniqueName: \"kubernetes.io/projected/e739f1b8-2307-4833-9736-201241c92ef3-kube-api-access-lwxjf\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.678411 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.678386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e739f1b8-2307-4833-9736-201241c92ef3-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.687295 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.687268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxjf\" (UniqueName: \"kubernetes.io/projected/e739f1b8-2307-4833-9736-201241c92ef3-kube-api-access-lwxjf\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-wgrrs\" (UID: \"e739f1b8-2307-4833-9736-201241c92ef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.735829 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.735808 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"
Apr 20 12:19:45.858644 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:45.858618 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs"]
Apr 20 12:19:45.861387 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:19:45.861349 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode739f1b8_2307_4833_9736_201241c92ef3.slice/crio-35d3fcc274bd29041337e6e921eac3ad97d26333124b7eb00d79ee4d414bbea1 WatchSource:0}: Error finding container 35d3fcc274bd29041337e6e921eac3ad97d26333124b7eb00d79ee4d414bbea1: Status 404 returned error can't find the container with id 35d3fcc274bd29041337e6e921eac3ad97d26333124b7eb00d79ee4d414bbea1
Apr 20 12:19:46.214780 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:46.214747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs" event={"ID":"e739f1b8-2307-4833-9736-201241c92ef3","Type":"ContainerStarted","Data":"35d3fcc274bd29041337e6e921eac3ad97d26333124b7eb00d79ee4d414bbea1"}
Apr 20 12:19:49.225475 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:49.225435 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs" event={"ID":"e739f1b8-2307-4833-9736-201241c92ef3","Type":"ContainerStarted","Data":"4bf7a8bf0f3aba1a207d813d8171783160099a9e28ac1a25681206fc1dde7581"}
Apr 20 12:19:49.245930 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:49.245874 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-wgrrs" podStartSLOduration=1.755817572 podStartE2EDuration="4.245857805s" podCreationTimestamp="2026-04-20 12:19:45 +0000 UTC" firstStartedPulling="2026-04-20 12:19:45.863822687 +0000 UTC m=+335.367381728" lastFinishedPulling="2026-04-20 12:19:48.35386292 +0000 UTC m=+337.857421961" observedRunningTime="2026-04-20 12:19:49.245339998 +0000 UTC m=+338.748899071" watchObservedRunningTime="2026-04-20 12:19:49.245857805 +0000 UTC m=+338.749416868"
Apr 20 12:19:52.276216 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.276180 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v96hf"]
Apr 20 12:19:52.279999 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.279983 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.283056 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.283030 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 12:19:52.283164 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.283039 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-7xcjl\""
Apr 20 12:19:52.284210 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.284187 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 12:19:52.288040 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.288003 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v96hf"]
Apr 20 12:19:52.427444 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.427410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.427570 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.427453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9nk\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-kube-api-access-vr9nk\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.528833 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.528762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.528833 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.528798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9nk\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-kube-api-access-vr9nk\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.537060 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.537015 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9nk\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-kube-api-access-vr9nk\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.537461 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.537439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6e6a97-8546-44ca-8d0a-6536e16b41cf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v96hf\" (UID: \"0d6e6a97-8546-44ca-8d0a-6536e16b41cf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.600985 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.600960 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:52.712316 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:52.712276 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v96hf"]
Apr 20 12:19:52.715173 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:19:52.715140 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d6e6a97_8546_44ca_8d0a_6536e16b41cf.slice/crio-192d507dec0b393dc84a92922723086bab9bd6aca68efe3a1e8f818c6baf9b38 WatchSource:0}: Error finding container 192d507dec0b393dc84a92922723086bab9bd6aca68efe3a1e8f818c6baf9b38: Status 404 returned error can't find the container with id 192d507dec0b393dc84a92922723086bab9bd6aca68efe3a1e8f818c6baf9b38
Apr 20 12:19:53.237322 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:53.237282 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf" event={"ID":"0d6e6a97-8546-44ca-8d0a-6536e16b41cf","Type":"ContainerStarted","Data":"192d507dec0b393dc84a92922723086bab9bd6aca68efe3a1e8f818c6baf9b38"}
Apr 20 12:19:55.774736 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.774706 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"]
Apr 20 12:19:55.777828 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.777807 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:55.780125 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.780105 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-pb87q\""
Apr 20 12:19:55.786799 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.786780 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"]
Apr 20 12:19:55.955574 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.955534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6rr\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-kube-api-access-5b6rr\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:55.955725 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:55.955643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.056112 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.056036 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.056112 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.056099 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6rr\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-kube-api-access-5b6rr\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.063980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.063956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.064099 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.064084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6rr\" (UniqueName: \"kubernetes.io/projected/51d1be25-d39b-4f51-84e0-90949410f308-kube-api-access-5b6rr\") pod \"cert-manager-cainjector-8966b78d4-cbvpc\" (UID: \"51d1be25-d39b-4f51-84e0-90949410f308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.086763 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.086738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"
Apr 20 12:19:56.197229 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.197201 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cbvpc"]
Apr 20 12:19:56.199626 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:19:56.199599 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d1be25_d39b_4f51_84e0_90949410f308.slice/crio-f94f38cfd5cea3dfeac88f89f65c0d8ade45b8cb4397e8dc8ecd6da576ea8a36 WatchSource:0}: Error finding container f94f38cfd5cea3dfeac88f89f65c0d8ade45b8cb4397e8dc8ecd6da576ea8a36: Status 404 returned error can't find the container with id f94f38cfd5cea3dfeac88f89f65c0d8ade45b8cb4397e8dc8ecd6da576ea8a36
Apr 20 12:19:56.250451 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.250426 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc" event={"ID":"51d1be25-d39b-4f51-84e0-90949410f308","Type":"ContainerStarted","Data":"f94f38cfd5cea3dfeac88f89f65c0d8ade45b8cb4397e8dc8ecd6da576ea8a36"}
Apr 20 12:19:56.251637 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.251616 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf" event={"ID":"0d6e6a97-8546-44ca-8d0a-6536e16b41cf","Type":"ContainerStarted","Data":"33fdb5d8e1d7a26126a98f917036ec26248774bc5895231e2577db4bde9b25e0"}
Apr 20 12:19:56.251771 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.251755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:19:56.267881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:56.267842 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf" podStartSLOduration=1.686355727 podStartE2EDuration="4.26782833s" podCreationTimestamp="2026-04-20 12:19:52 +0000 UTC" firstStartedPulling="2026-04-20 12:19:52.71699729 +0000 UTC m=+342.220556331" lastFinishedPulling="2026-04-20 12:19:55.298469891 +0000 UTC m=+344.802028934" observedRunningTime="2026-04-20 12:19:56.266748805 +0000 UTC m=+345.770307869" watchObservedRunningTime="2026-04-20 12:19:56.26782833 +0000 UTC m=+345.771387393"
Apr 20 12:19:57.255852 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:57.255811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc" event={"ID":"51d1be25-d39b-4f51-84e0-90949410f308","Type":"ContainerStarted","Data":"1725fe199fa82dea014295e92e431f5647e587df235cd7cae052f09d19f914b2"}
Apr 20 12:19:57.270512 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:19:57.270470 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-cbvpc" podStartSLOduration=2.270456286 podStartE2EDuration="2.270456286s" podCreationTimestamp="2026-04-20 12:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:19:57.269989001 +0000 UTC m=+346.773548065" watchObservedRunningTime="2026-04-20 12:19:57.270456286 +0000 UTC m=+346.774015349"
Apr 20 12:20:02.258110 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:02.258007 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-v96hf"
Apr 20 12:20:15.508072 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.508013 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69"]
Apr 20 12:20:15.511504 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.511481 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.513920 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.513898 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 12:20:15.514998 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.514980 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d2z52\"" Apr 20 12:20:15.514998 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.514992 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 12:20:15.519291 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.519257 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69"] Apr 20 12:20:15.603130 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.603101 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.603244 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.603140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfbx2\" (UniqueName: \"kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.603244 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.603187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.704325 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.704295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.704476 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.704351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.704476 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.704397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfbx2\" (UniqueName: \"kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.704714 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.704691 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.704776 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.704726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.712519 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.712500 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfbx2\" (UniqueName: \"kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.821277 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.821237 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:15.942141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:15.942111 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69"] Apr 20 12:20:15.944883 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:20:15.944860 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcb30fa1_8858_4f87_b16d_44d16cfb99aa.slice/crio-3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f WatchSource:0}: Error finding container 3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f: Status 404 returned error can't find the container with id 3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f Apr 20 12:20:16.314599 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:16.314569 2570 generic.go:358] "Generic (PLEG): container finished" podID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerID="abbd0250a770c0f1cdc46df9f0aa44cbe5884128840956279454a2b8163564ed" exitCode=0 Apr 20 12:20:16.314756 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:16.314651 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" event={"ID":"dcb30fa1-8858-4f87-b16d-44d16cfb99aa","Type":"ContainerDied","Data":"abbd0250a770c0f1cdc46df9f0aa44cbe5884128840956279454a2b8163564ed"} Apr 20 12:20:16.314756 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:16.314686 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" event={"ID":"dcb30fa1-8858-4f87-b16d-44d16cfb99aa","Type":"ContainerStarted","Data":"3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f"} Apr 20 12:20:19.325786 ip-10-0-135-187 kubenswrapper[2570]: 
I0420 12:20:19.325747 2570 generic.go:358] "Generic (PLEG): container finished" podID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerID="a3298efdc7abfaed064efded84f90aba0b8de849d70812df89791ea79f04b715" exitCode=0 Apr 20 12:20:19.326159 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:19.325814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" event={"ID":"dcb30fa1-8858-4f87-b16d-44d16cfb99aa","Type":"ContainerDied","Data":"a3298efdc7abfaed064efded84f90aba0b8de849d70812df89791ea79f04b715"} Apr 20 12:20:20.331407 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:20.331372 2570 generic.go:358] "Generic (PLEG): container finished" podID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerID="4b131e707832aaaac92f8de3a376527b959b7d4fad16de3fce603fbe93071756" exitCode=0 Apr 20 12:20:20.331741 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:20.331458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" event={"ID":"dcb30fa1-8858-4f87-b16d-44d16cfb99aa","Type":"ContainerDied","Data":"4b131e707832aaaac92f8de3a376527b959b7d4fad16de3fce603fbe93071756"} Apr 20 12:20:21.448942 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.448920 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:21.550870 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.550839 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfbx2\" (UniqueName: \"kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2\") pod \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " Apr 20 12:20:21.551007 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.550877 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util\") pod \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " Apr 20 12:20:21.551007 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.550916 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle\") pod \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\" (UID: \"dcb30fa1-8858-4f87-b16d-44d16cfb99aa\") " Apr 20 12:20:21.551361 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.551331 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle" (OuterVolumeSpecName: "bundle") pod "dcb30fa1-8858-4f87-b16d-44d16cfb99aa" (UID: "dcb30fa1-8858-4f87-b16d-44d16cfb99aa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:20:21.552884 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.552864 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2" (OuterVolumeSpecName: "kube-api-access-nfbx2") pod "dcb30fa1-8858-4f87-b16d-44d16cfb99aa" (UID: "dcb30fa1-8858-4f87-b16d-44d16cfb99aa"). InnerVolumeSpecName "kube-api-access-nfbx2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:20:21.555555 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.555531 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util" (OuterVolumeSpecName: "util") pod "dcb30fa1-8858-4f87-b16d-44d16cfb99aa" (UID: "dcb30fa1-8858-4f87-b16d-44d16cfb99aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:20:21.651881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.651811 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-bundle\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:20:21.651881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.651835 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nfbx2\" (UniqueName: \"kubernetes.io/projected/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-kube-api-access-nfbx2\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:20:21.651881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:21.651844 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcb30fa1-8858-4f87-b16d-44d16cfb99aa-util\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:20:22.341441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:22.341409 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" event={"ID":"dcb30fa1-8858-4f87-b16d-44d16cfb99aa","Type":"ContainerDied","Data":"3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f"} Apr 20 12:20:22.341441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:22.341441 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb21a75f0497d8fba10030f3d36e411409b9cc90f7b719a1cc8b2a635b55e2f" Apr 20 12:20:22.341636 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:22.341460 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e2mp69" Apr 20 12:20:38.716555 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716514 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7"] Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716772 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="extract" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716782 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="extract" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716796 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="util" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716801 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="util" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716808 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="pull" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716813 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="pull" Apr 20 12:20:38.717204 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.716874 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcb30fa1-8858-4f87-b16d-44d16cfb99aa" containerName="extract" Apr 20 12:20:38.719659 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.719643 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.723309 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723284 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:20:38.723441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723342 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 20 12:20:38.723441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723366 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 20 12:20:38.723441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723349 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 20 12:20:38.723441 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723349 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 20 12:20:38.723633 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.723307 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-9pj5d\"" Apr 20 12:20:38.728276 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.728254 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7"] Apr 20 12:20:38.770523 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.770498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwj7\" (UniqueName: \"kubernetes.io/projected/6d46f1ef-1413-47ca-93ae-d1a3a202110e-kube-api-access-rlwj7\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.770642 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.770538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-cert\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.770642 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.770560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d46f1ef-1413-47ca-93ae-d1a3a202110e-manager-config\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.770642 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.770627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-metrics-certs\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.871769 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.871744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-metrics-certs\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.871881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.871779 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwj7\" (UniqueName: \"kubernetes.io/projected/6d46f1ef-1413-47ca-93ae-d1a3a202110e-kube-api-access-rlwj7\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.871881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.871809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-cert\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.871881 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.871839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d46f1ef-1413-47ca-93ae-d1a3a202110e-manager-config\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " 
pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.872516 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.872496 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d46f1ef-1413-47ca-93ae-d1a3a202110e-manager-config\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.874222 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.874202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-metrics-certs\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.874296 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.874261 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46f1ef-1413-47ca-93ae-d1a3a202110e-cert\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:38.878499 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:38.878479 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwj7\" (UniqueName: \"kubernetes.io/projected/6d46f1ef-1413-47ca-93ae-d1a3a202110e-kube-api-access-rlwj7\") pod \"jobset-controller-manager-66d47645fc-9c6f7\" (UID: \"6d46f1ef-1413-47ca-93ae-d1a3a202110e\") " pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:39.029071 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:39.029041 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:39.143784 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:39.143758 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7"] Apr 20 12:20:39.145666 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:20:39.145639 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d46f1ef_1413_47ca_93ae_d1a3a202110e.slice/crio-520e23cbbdf710c46570396e66c6df646ec10d347e08ba5a9111256142fd6b7f WatchSource:0}: Error finding container 520e23cbbdf710c46570396e66c6df646ec10d347e08ba5a9111256142fd6b7f: Status 404 returned error can't find the container with id 520e23cbbdf710c46570396e66c6df646ec10d347e08ba5a9111256142fd6b7f Apr 20 12:20:39.398305 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:39.398216 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" event={"ID":"6d46f1ef-1413-47ca-93ae-d1a3a202110e","Type":"ContainerStarted","Data":"520e23cbbdf710c46570396e66c6df646ec10d347e08ba5a9111256142fd6b7f"} Apr 20 12:20:42.408070 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:42.407966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" event={"ID":"6d46f1ef-1413-47ca-93ae-d1a3a202110e","Type":"ContainerStarted","Data":"2ab9a69736f5feb42035d31e6561a147a4799bcf97851b38ee6c2ab71c37249a"} Apr 20 12:20:42.408468 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:42.408102 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:20:42.424221 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:42.424173 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" podStartSLOduration=1.49095006 podStartE2EDuration="4.424159942s" podCreationTimestamp="2026-04-20 12:20:38 +0000 UTC" firstStartedPulling="2026-04-20 12:20:39.147067995 +0000 UTC m=+388.650627036" lastFinishedPulling="2026-04-20 12:20:42.080277877 +0000 UTC m=+391.583836918" observedRunningTime="2026-04-20 12:20:42.42372731 +0000 UTC m=+391.927286373" watchObservedRunningTime="2026-04-20 12:20:42.424159942 +0000 UTC m=+391.927719020" Apr 20 12:20:53.416230 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:20:53.416195 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-66d47645fc-9c6f7" Apr 20 12:22:06.415325 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.415291 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7999b796d4-xtlb8"] Apr 20 12:22:06.418571 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.418550 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.428650 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.428629 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7999b796d4-xtlb8"] Apr 20 12:22:06.508567 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-oauth-config\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.508691 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508599 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.508691 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508654 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-config\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.508760 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-oauth-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 
20 12:22:06.508760 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508725 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-service-ca\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.508760 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-trusted-ca-bundle\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.508871 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.508763 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ql2\" (UniqueName: \"kubernetes.io/projected/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-kube-api-access-z4ql2\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609837 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-config\") pod 
\"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-oauth-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-service-ca\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-trusted-ca-bundle\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ql2\" (UniqueName: \"kubernetes.io/projected/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-kube-api-access-z4ql2\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.609980 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.609952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-oauth-config\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.610756 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.610727 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-service-ca\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.610870 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.610727 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-oauth-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.610870 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.610794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-config\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.610870 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.610802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-trusted-ca-bundle\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.612320 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.612295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-oauth-config\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.612419 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.612401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-console-serving-cert\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.618931 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.618908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ql2\" (UniqueName: \"kubernetes.io/projected/2cd001d7-ab0f-4eef-86cc-cb6d202afeb3-kube-api-access-z4ql2\") pod \"console-7999b796d4-xtlb8\" (UID: \"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3\") " pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.726734 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.726677 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:06.843479 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:06.843388 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7999b796d4-xtlb8"] Apr 20 12:22:06.845997 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:22:06.845969 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd001d7_ab0f_4eef_86cc_cb6d202afeb3.slice/crio-012029848e4032de020d84142c416863070995f42ca220917006273c871e672c WatchSource:0}: Error finding container 012029848e4032de020d84142c416863070995f42ca220917006273c871e672c: Status 404 returned error can't find the container with id 012029848e4032de020d84142c416863070995f42ca220917006273c871e672c Apr 20 12:22:07.668234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:07.668195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7999b796d4-xtlb8" event={"ID":"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3","Type":"ContainerStarted","Data":"ca1e9f413d7461d8084e6de475ef679a686831ae1085ad7b47b27940f2bc2f21"} Apr 20 12:22:07.668234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:07.668231 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7999b796d4-xtlb8" event={"ID":"2cd001d7-ab0f-4eef-86cc-cb6d202afeb3","Type":"ContainerStarted","Data":"012029848e4032de020d84142c416863070995f42ca220917006273c871e672c"} Apr 20 12:22:07.686255 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:07.686212 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7999b796d4-xtlb8" podStartSLOduration=1.686199324 podStartE2EDuration="1.686199324s" podCreationTimestamp="2026-04-20 12:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:22:07.684190355 +0000 UTC 
m=+477.187749417" watchObservedRunningTime="2026-04-20 12:22:07.686199324 +0000 UTC m=+477.189758386" Apr 20 12:22:16.727262 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:16.727226 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:16.727655 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:16.727272 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:16.731960 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:16.731939 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:17.701860 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:17.701830 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7999b796d4-xtlb8" Apr 20 12:22:17.746823 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:17.746783 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:22:42.766547 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:42.766490 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-785547cccd-4b6ht" podUID="a35d9f16-8267-4898-9d09-d9f89df0e470" containerName="console" containerID="cri-o://dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e" gracePeriod=15 Apr 20 12:22:42.997445 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:42.997424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785547cccd-4b6ht_a35d9f16-8267-4898-9d09-d9f89df0e470/console/0.log" Apr 20 12:22:42.997544 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:42.997485 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:22:43.076055 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.075979 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076055 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076034 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076055 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076054 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076274 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076086 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076274 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076131 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076274 
ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076170 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnpdn\" (UniqueName: \"kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076274 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076186 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config\") pod \"a35d9f16-8267-4898-9d09-d9f89df0e470\" (UID: \"a35d9f16-8267-4898-9d09-d9f89df0e470\") " Apr 20 12:22:43.076460 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076407 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:22:43.076629 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076591 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca" (OuterVolumeSpecName: "service-ca") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:22:43.076629 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076596 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:22:43.076629 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.076616 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config" (OuterVolumeSpecName: "console-config") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:22:43.078179 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.078155 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:22:43.078279 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.078223 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:22:43.078335 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.078285 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn" (OuterVolumeSpecName: "kube-api-access-tnpdn") pod "a35d9f16-8267-4898-9d09-d9f89df0e470" (UID: "a35d9f16-8267-4898-9d09-d9f89df0e470"). InnerVolumeSpecName "kube-api-access-tnpdn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:22:43.177335 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177312 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-oauth-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177335 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177333 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnpdn\" (UniqueName: \"kubernetes.io/projected/a35d9f16-8267-4898-9d09-d9f89df0e470-kube-api-access-tnpdn\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177344 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-console-config\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177353 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-trusted-ca-bundle\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177363 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a35d9f16-8267-4898-9d09-d9f89df0e470-console-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177372 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-service-ca\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.177450 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.177379 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a35d9f16-8267-4898-9d09-d9f89df0e470-oauth-serving-cert\") on node \"ip-10-0-135-187.ec2.internal\" DevicePath \"\"" Apr 20 12:22:43.781955 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.781931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785547cccd-4b6ht_a35d9f16-8267-4898-9d09-d9f89df0e470/console/0.log" Apr 20 12:22:43.782401 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.781967 2570 generic.go:358] "Generic (PLEG): container finished" podID="a35d9f16-8267-4898-9d09-d9f89df0e470" containerID="dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e" exitCode=2 Apr 20 12:22:43.782401 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.781996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785547cccd-4b6ht" event={"ID":"a35d9f16-8267-4898-9d09-d9f89df0e470","Type":"ContainerDied","Data":"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e"} Apr 20 12:22:43.782401 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.782040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785547cccd-4b6ht" event={"ID":"a35d9f16-8267-4898-9d09-d9f89df0e470","Type":"ContainerDied","Data":"bd00f8f6d48c75195bf5d2d814ee6412bbabd8b074bfa7c7d30912939921c040"} Apr 20 12:22:43.782401 ip-10-0-135-187 
kubenswrapper[2570]: I0420 12:22:43.782048 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785547cccd-4b6ht" Apr 20 12:22:43.782401 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.782065 2570 scope.go:117] "RemoveContainer" containerID="dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e" Apr 20 12:22:43.789466 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.789447 2570 scope.go:117] "RemoveContainer" containerID="dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e" Apr 20 12:22:43.789701 ip-10-0-135-187 kubenswrapper[2570]: E0420 12:22:43.789680 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e\": container with ID starting with dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e not found: ID does not exist" containerID="dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e" Apr 20 12:22:43.789767 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.789711 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e"} err="failed to get container status \"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e\": rpc error: code = NotFound desc = could not find container \"dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e\": container with ID starting with dfc1dab5a51c30a31b1170bb8c2bfed630fce5236fb218cefc2813eef0808a9e not found: ID does not exist" Apr 20 12:22:43.798932 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.798905 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:22:43.802687 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:43.802664 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-785547cccd-4b6ht"] Apr 20 12:22:45.138464 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:22:45.138425 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35d9f16-8267-4898-9d09-d9f89df0e470" path="/var/lib/kubelet/pods/a35d9f16-8267-4898-9d09-d9f89df0e470/volumes" Apr 20 12:59:27.742949 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:27.742920 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2zqds_32b53c98-449d-4d6d-9ec4-06d42d60860e/global-pull-secret-syncer/0.log" Apr 20 12:59:27.913218 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:27.913164 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bszzh_d45ac98c-ea8f-4ad5-8034-4c9981d9693a/konnectivity-agent/0.log" Apr 20 12:59:27.984537 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:27.984507 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-187.ec2.internal_5dfa79b55f33a77023d71de337bc06ee/haproxy/0.log" Apr 20 12:59:31.436907 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.436880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jct8_8322defd-f262-40d7-95f0-1e747662e436/node-exporter/0.log" Apr 20 12:59:31.460785 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.460744 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jct8_8322defd-f262-40d7-95f0-1e747662e436/kube-rbac-proxy/0.log" Apr 20 12:59:31.487531 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.487501 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jct8_8322defd-f262-40d7-95f0-1e747662e436/init-textfile/0.log" Apr 20 12:59:31.887137 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.887093 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vmbsf_7a13f073-4cfd-494b-8515-5ab165e362bf/prometheus-operator/0.log" Apr 20 12:59:31.908523 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.908500 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vmbsf_7a13f073-4cfd-494b-8515-5ab165e362bf/kube-rbac-proxy/0.log" Apr 20 12:59:31.934072 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:31.934048 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-jxh2t_5f94d305-89ae-4c03-800a-7c2bce1b4fd1/prometheus-operator-admission-webhook/0.log" Apr 20 12:59:33.952589 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:33.952557 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7999b796d4-xtlb8_2cd001d7-ab0f-4eef-86cc-cb6d202afeb3/console/0.log" Apr 20 12:59:33.981875 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:33.981848 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-86khj_3b4b233a-eebb-419f-81c4-b65496b65e9b/download-server/0.log" Apr 20 12:59:34.367667 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.367637 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"] Apr 20 12:59:34.367977 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.367960 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a35d9f16-8267-4898-9d09-d9f89df0e470" containerName="console" Apr 20 12:59:34.368079 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.367980 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35d9f16-8267-4898-9d09-d9f89df0e470" containerName="console" Apr 20 12:59:34.368079 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.368070 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a35d9f16-8267-4898-9d09-d9f89df0e470" containerName="console" Apr 20 12:59:34.371014 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.370994 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.373412 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.373393 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"openshift-service-ca.crt\"" Apr 20 12:59:34.374371 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.374353 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-chpds\"/\"default-dockercfg-pv8sl\"" Apr 20 12:59:34.374472 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.374424 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"kube-root-ca.crt\"" Apr 20 12:59:34.379236 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.379216 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"] Apr 20 12:59:34.517971 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.517937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvtv\" (UniqueName: \"kubernetes.io/projected/9290e6ad-1be0-439d-899b-bd8a5cc02404-kube-api-access-kjvtv\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.518141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.517986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-lib-modules\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: 
\"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.518141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.518028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-sys\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.518141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.518071 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-proc\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.518141 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.518086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-podres\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.619186 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-proc\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" Apr 20 12:59:34.619186 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-podres\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619186 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619163 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvtv\" (UniqueName: \"kubernetes.io/projected/9290e6ad-1be0-439d-899b-bd8a5cc02404-kube-api-access-kjvtv\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-lib-modules\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-sys\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-proc\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-podres\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-sys\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.619414 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.619347 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9290e6ad-1be0-439d-899b-bd8a5cc02404-lib-modules\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.626666 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.626645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvtv\" (UniqueName: \"kubernetes.io/projected/9290e6ad-1be0-439d-899b-bd8a5cc02404-kube-api-access-kjvtv\") pod \"perf-node-gather-daemonset-qfdrh\" (UID: \"9290e6ad-1be0-439d-899b-bd8a5cc02404\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.681730 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.681708 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:34.794648 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.794624 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"]
Apr 20 12:59:34.796889 ip-10-0-135-187 kubenswrapper[2570]: W0420 12:59:34.796859 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9290e6ad_1be0_439d_899b_bd8a5cc02404.slice/crio-c0e7c6c80464ac1eabf43eccdbc1c140956736fe4af1799d31daf9e59ba65669 WatchSource:0}: Error finding container c0e7c6c80464ac1eabf43eccdbc1c140956736fe4af1799d31daf9e59ba65669: Status 404 returned error can't find the container with id c0e7c6c80464ac1eabf43eccdbc1c140956736fe4af1799d31daf9e59ba65669
Apr 20 12:59:34.798385 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:34.798369 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:59:35.062648 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.062622 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrx42_0f53da55-04fb-46ea-a138-a50e7b354151/dns/0.log"
Apr 20 12:59:35.084682 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.084662 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrx42_0f53da55-04fb-46ea-a138-a50e7b354151/kube-rbac-proxy/0.log"
Apr 20 12:59:35.106336 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.106312 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9zdrr_751a35e3-65ef-4efa-80f2-9cecd4a7b003/dns-node-resolver/0.log"
Apr 20 12:59:35.517364 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.517336 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6cfd894b8b-drw6h_aef389fc-fc83-4809-871c-658c7804d8e6/registry/0.log"
Apr 20 12:59:35.541842 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.541818 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" event={"ID":"9290e6ad-1be0-439d-899b-bd8a5cc02404","Type":"ContainerStarted","Data":"cacf60929ac58a19c70a5fc106c9ffcd34ad1a781e5b5b5f570541bfb7de7314"}
Apr 20 12:59:35.541954 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.541847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" event={"ID":"9290e6ad-1be0-439d-899b-bd8a5cc02404","Type":"ContainerStarted","Data":"c0e7c6c80464ac1eabf43eccdbc1c140956736fe4af1799d31daf9e59ba65669"}
Apr 20 12:59:35.541954 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.541871 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:35.557521 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.557481 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh" podStartSLOduration=1.557469118 podStartE2EDuration="1.557469118s" podCreationTimestamp="2026-04-20 12:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:59:35.555841547 +0000 UTC m=+2725.059400621" watchObservedRunningTime="2026-04-20 12:59:35.557469118 +0000 UTC m=+2725.061028180"
Apr 20 12:59:35.560328 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:35.560306 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x4k8h_3f8fe785-c3f9-4d97-9683-833f64ab21aa/node-ca/0.log"
Apr 20 12:59:36.565577 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:36.565553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q7wvt_5ffce074-ab2a-4172-a6a3-7e85c82f6eb8/serve-healthcheck-canary/0.log"
Apr 20 12:59:36.969465 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:36.969376 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6kp8w_c2666143-2c9f-4812-bb5e-895b1c4c4891/kube-rbac-proxy/0.log"
Apr 20 12:59:36.988765 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:36.988734 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6kp8w_c2666143-2c9f-4812-bb5e-895b1c4c4891/exporter/0.log"
Apr 20 12:59:37.008178 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:37.008159 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6kp8w_c2666143-2c9f-4812-bb5e-895b1c4c4891/extractor/0.log"
Apr 20 12:59:38.643747 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:38.643716 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-66d47645fc-9c6f7_6d46f1ef-1413-47ca-93ae-d1a3a202110e/manager/0.log"
Apr 20 12:59:41.553568 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:41.553545 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qfdrh"
Apr 20 12:59:41.783043 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:41.782997 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bfd6p_0a26209c-50bb-48d5-947b-95041bb256df/migrator/0.log"
Apr 20 12:59:41.815434 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:41.815376 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bfd6p_0a26209c-50bb-48d5-947b-95041bb256df/graceful-termination/0.log"
Apr 20 12:59:43.438609 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.438583 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/kube-multus-additional-cni-plugins/0.log"
Apr 20 12:59:43.460715 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.460695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/egress-router-binary-copy/0.log"
Apr 20 12:59:43.482585 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.482565 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/cni-plugins/0.log"
Apr 20 12:59:43.503821 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.503804 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/bond-cni-plugin/0.log"
Apr 20 12:59:43.526124 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.526108 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/routeoverride-cni/0.log"
Apr 20 12:59:43.552746 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.552722 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/whereabouts-cni-bincopy/0.log"
Apr 20 12:59:43.578514 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.578492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbrg9_865bf984-f3c6-4787-a88b-65144a2e4549/whereabouts-cni/0.log"
Apr 20 12:59:43.644944 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.644921 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vrsq9_f08f930e-1834-40c7-9e3c-4cfd5402147c/kube-multus/0.log"
Apr 20 12:59:43.666106 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.666055 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8xfxk_faa17428-5484-40d3-9fb5-b11e5a64f1be/network-metrics-daemon/0.log"
Apr 20 12:59:43.696942 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:43.696924 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8xfxk_faa17428-5484-40d3-9fb5-b11e5a64f1be/kube-rbac-proxy/0.log"
Apr 20 12:59:44.763263 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.763235 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/ovn-controller/0.log"
Apr 20 12:59:44.797234 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.797210 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/ovn-acl-logging/0.log"
Apr 20 12:59:44.813846 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.813828 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/kube-rbac-proxy-node/0.log"
Apr 20 12:59:44.833436 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.833414 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 12:59:44.852280 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.852262 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/northd/0.log"
Apr 20 12:59:44.873118 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.873093 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/nbdb/0.log"
Apr 20 12:59:44.892306 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.892289 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/sbdb/0.log"
Apr 20 12:59:44.979122 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:44.979082 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pw7dn_a627833c-c9d5-450e-a3be-cae5d2eed758/ovnkube-controller/0.log"
Apr 20 12:59:46.274192 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:46.274158 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hf6xh_6b4ca455-a400-4bf3-8bd0-0b93d1456970/network-check-target-container/0.log"
Apr 20 12:59:47.204600 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:47.204571 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-t7cr8_d4423a85-bbfb-4bda-a5c0-8729c2068a9b/iptables-alerter/0.log"
Apr 20 12:59:47.800494 ip-10-0-135-187 kubenswrapper[2570]: I0420 12:59:47.800469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9nxbw_c68b8adf-d528-45fe-9c38-cc642642a5aa/tuned/0.log"