Apr 21 10:01:12.355698 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 10:01:12.355708 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 10:01:12.355715 ip-10-0-140-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 10:01:12.355939 ip-10-0-140-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 10:01:22.448551 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 10:01:22.448572 ip-10-0-140-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7834229c4f0f47d697db1014852385a8 --
Apr 21 10:03:50.484864 ip-10-0-140-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:50.900312 ip-10-0-140-144 kubenswrapper[2543]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:50.900312 ip-10-0-140-144 kubenswrapper[2543]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:50.900312 ip-10-0-140-144 kubenswrapper[2543]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:50.900312 ip-10-0-140-144 kubenswrapper[2543]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:50.900312 ip-10-0-140-144 kubenswrapper[2543]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:50.902797 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.902711 2543 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:50.905170 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905155 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:50.905170 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905171 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905174 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905177 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905180 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905183 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905186 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905189 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905191 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905194 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905197 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905200 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905203 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905205 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905208 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905210 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905218 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905221 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905224 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905226 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905229 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:50.905251 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905232 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905234 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905237 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905239 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905242 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905245 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905247 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905249 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905252 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905255 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905258 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905261 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905263 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905266 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905269 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905271 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905275 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905279 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905281 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:50.905733 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905285 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905288 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905290 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905292 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905295 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905297 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905300 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905302 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905305 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905307 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905309 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905313 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905317 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905319 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905322 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905326 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905329 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905332 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905335 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:50.906194 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905337 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905340 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905342 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905345 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905348 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905350 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905352 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905356 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905358 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905360 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905363 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905365 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905367 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905370 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905373 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905376 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905379 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905381 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905383 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905386 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:50.906701 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905388 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905390 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905393 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905395 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905398 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905401 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905403 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905819 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905826 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905829 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905832 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905834 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905837 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905839 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905842 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905845 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905848 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905850 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905853 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:50.907173 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905856 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905859 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905861 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905864 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905866 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905869 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905871 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905874 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905876 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905879 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905881 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905884 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905886 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905889 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905891 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905893 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905897 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905899 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905902 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905904 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:50.907657 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905907 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905910 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905912 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905915 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905917 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905920 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905922 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905925 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905927 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905929 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905932 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905934 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905937 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905940 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905942 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905946 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905950 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905953 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905956 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:50.908147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905958 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905961 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905964 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905967 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905969 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905972 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905975 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905977 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905980 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905982 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905984 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905987 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905989 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905995 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.905998 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906001 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906003 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906005 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906008 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906010 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:50.908621 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906012 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906015 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906017 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906020 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906022 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906025 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906027 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906031 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906033 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906036 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906038 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906040 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906043 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906045 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.906048 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907533 2543 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907555 2543 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907562 2543 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907568 2543 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907572 2543 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907575 2543 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:50.909100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907580 2543 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907584 2543 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907588 2543 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907591 2543 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907598 2543 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907602 2543 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907605 2543 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907608 2543 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907611 2543 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907614 2543 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907617 2543 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907619 2543 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907622 2543 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907631 2543 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907633 2543 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907637 2543 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907639 2543 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907642 2543 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907646 2543 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907655 2543 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907658 2543 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907662 2543 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907665 2543 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907668 2543 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 10:03:50.909637 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907671 2543 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907674 2543 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907677 2543 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907682 2543 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907684 2543 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907687 2543 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907690 2543 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907693 2543 flags.go:64] FLAG: --enable-server="true"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907696 2543 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907703 2543 flags.go:64] FLAG: --event-burst="100"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907706 2543 flags.go:64] FLAG: --event-qps="50"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907709 2543 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907713 2543 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907716 2543 flags.go:64] FLAG: --eviction-hard=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907719 2543 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907722 2543 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907725 2543 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907728 2543 flags.go:64] FLAG: --eviction-soft=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907731 2543 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907734 2543 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907737 2543 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907740 2543 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907742 2543 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907745 2543 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907748 2543 flags.go:64] FLAG: --feature-gates=""
Apr 21 10:03:50.910197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907752 2543 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907755 2543 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 10:03:50.910817 ip-10-0-140-144
kubenswrapper[2543]: I0421 10:03:50.907757 2543 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907767 2543 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907770 2543 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907773 2543 flags.go:64] FLAG: --help="false" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907776 2543 flags.go:64] FLAG: --hostname-override="ip-10-0-140-144.ec2.internal" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907779 2543 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907781 2543 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907784 2543 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907788 2543 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907791 2543 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907794 2543 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907797 2543 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907800 2543 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907803 2543 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907805 2543 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907808 2543 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907811 2543 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907815 2543 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907818 2543 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907821 2543 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907823 2543 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907826 2543 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:50.910817 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907829 2543 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907831 2543 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907834 2543 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907840 2543 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907843 2543 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907845 2543 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: 
I0421 10:03:50.907848 2543 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907851 2543 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907854 2543 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907857 2543 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907860 2543 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907865 2543 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907875 2543 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907879 2543 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907882 2543 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907885 2543 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907888 2543 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907890 2543 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907893 2543 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907896 2543 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907900 2543 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907907 2543 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907910 2543 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907913 2543 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:50.911408 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907916 2543 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907919 2543 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907924 2543 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907927 2543 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907931 2543 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907934 2543 flags.go:64] FLAG: --port="10250" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907937 2543 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907940 2543 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-088eba366e041b88a" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907942 2543 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907945 2543 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907948 
2543 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907951 2543 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907954 2543 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907957 2543 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907960 2543 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907963 2543 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907965 2543 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907969 2543 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907972 2543 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907975 2543 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907978 2543 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907981 2543 flags.go:64] FLAG: --runonce="false" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907984 2543 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907987 2543 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907990 2543 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:50.912116 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:03:50.907992 2543 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.907995 2543 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908000 2543 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908003 2543 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908006 2543 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908009 2543 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908012 2543 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908014 2543 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908017 2543 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908020 2543 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908023 2543 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908027 2543 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908033 2543 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908035 2543 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908038 2543 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908043 2543 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908046 2543 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908048 2543 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908051 2543 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908054 2543 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908057 2543 flags.go:64] FLAG: --v="2" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908061 2543 flags.go:64] FLAG: --version="false" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908065 2543 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908070 2543 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908073 2543 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:50.912717 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908179 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908183 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908187 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908191 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908195 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908198 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908200 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908203 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908205 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908209 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908212 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908214 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908217 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908220 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908222 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908225 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908227 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908229 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908233 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908235 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:50.913346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908238 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908241 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908243 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908246 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908248 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908251 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908253 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908256 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908259 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908261 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908263 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908266 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908268 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908271 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908273 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908276 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908278 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908281 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908283 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908286 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:50.913897 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908288 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908292 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908294 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908297 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908299 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908302 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908304 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908307 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908309 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908312 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908315 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908318 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908320 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908323 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908325 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908328 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908330 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908333 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908335 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:50.914391 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908339 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908343 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908345 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908348 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908350 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908353 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908355 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908358 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908360 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908362 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908365 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908368 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908370 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908373 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908376 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908379 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908381 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908384 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908386 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908389 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:50.914903 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908391 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908393 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908396 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908399 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908402 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908404 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.908407 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:50.915399 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.908416 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:50.916026 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.916007 2543 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:50.916055 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.916027 2543 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:50.916088 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916084 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916090 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916094 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916097 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916101 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916104 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916107 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916110 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916112 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916115 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916118 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:50.916119 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916121 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916126 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916131 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916135 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916140 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916143 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916146 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916149 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916151 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916154 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916156 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916159 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916162 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916164 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916167 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916169 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916172 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916174 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916177 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916180 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:50.916389 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916183 2543
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916186 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916188 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916190 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916193 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916195 2543 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916197 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916200 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916202 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916206 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916210 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916214 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916221 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916226 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916230 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916234 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916239 2543 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916242 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916245 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916248 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:50.916945 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916251 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916254 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916257 2543 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916260 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916262 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916265 2543 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916267 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916270 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916273 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916276 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916278 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916281 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916283 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916287 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916292 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916296 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916300 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916305 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:50.917457 ip-10-0-140-144 
kubenswrapper[2543]: W0421 10:03:50.916308 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916311 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:50.917457 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916313 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916316 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916318 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916320 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916323 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916325 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916328 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916331 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916333 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916336 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916339 2543 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916341 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916344 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916346 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916349 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:50.917975 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.916354 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916469 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916475 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916479 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916483 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916486 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916489 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916493 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916496 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916498 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916501 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916505 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916508 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916511 2543 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916513 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916516 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:50.918346 ip-10-0-140-144 
kubenswrapper[2543]: W0421 10:03:50.916519 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916521 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916524 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:50.918346 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916526 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916529 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916531 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916536 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916541 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916561 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916566 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916569 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916572 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916574 2543 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916577 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916579 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916582 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916585 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916587 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916589 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916592 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916594 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916596 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916599 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:50.918829 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916601 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916604 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:50.919325 ip-10-0-140-144 
kubenswrapper[2543]: W0421 10:03:50.916606 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916608 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916611 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916615 2543 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916618 2543 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916620 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916623 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916626 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916628 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916632 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916636 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916641 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916645 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 
10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916649 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916652 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916655 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916657 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916659 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:50.919325 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916662 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916664 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916666 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916669 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916672 2543 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916674 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916676 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916679 2543 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916681 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916684 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916686 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916689 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916691 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916694 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916696 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916699 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916701 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916704 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916707 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:50.919850 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916709 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:50.919850 
ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916713 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916717 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916721 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916725 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916729 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916733 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916736 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:50.916738 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.916743 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 10:03:50.920335 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.917561 2543 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 10:03:50.920335 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.919732 2543 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 10:03:50.920646 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.920634 2543 server.go:1019] "Starting client certificate rotation" Apr 21 10:03:50.920750 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.920734 2543 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 10:03:50.920779 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.920775 2543 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 10:03:50.943953 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.943693 2543 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 10:03:50.949481 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.949458 2543 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 10:03:50.967379 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.967359 2543 log.go:25] "Validated CRI v1 runtime API" Apr 21 10:03:50.972900 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.972886 2543 log.go:25] "Validated CRI v1 image API" Apr 21 10:03:50.974254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.974231 2543 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 10:03:50.977988 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.977962 2543 fs.go:135] Filesystem UUIDs: map[5a04215b-e1c4-4ae0-8d46-ee2addbec9f2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c5fa9ec8-3edd-46a1-9709-f35ad78c4796:/dev/nvme0n1p3] Apr 21 10:03:50.978063 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.977988 2543 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 
fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 10:03:50.982000 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.981980 2543 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 10:03:50.983669 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.983560 2543 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:50.981866246 +0000 UTC m=+0.384790194 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100038 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2780df505cee7f9390648e634d4865 SystemUUID:ec2780df-505c-ee7f-9390-648e634d4865 BootID:7834229c-4f0f-47d6-97db-1014852385a8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 
Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:71:8c:49:0b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:71:8c:49:0b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:b7:eb:88:62:cd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 10:03:50.983669 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.983661 2543 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 10:03:50.983799 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.983784 2543 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:50.984621 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.984598 2543 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:50.984754 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.984624 2543 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 10:03:50.984802 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.984764 2543 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 10:03:50.984802 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.984772 2543 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 10:03:50.984802 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.984785 2543 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:50.985579 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.985569 2543 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:50.986762 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.986751 2543 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 10:03:50.986869 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.986860 2543 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 10:03:50.988824 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.988814 2543 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 10:03:50.988862 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.988832 2543 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 10:03:50.988862 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.988846 2543 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 10:03:50.988862 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.988856 2543 kubelet.go:397] "Adding apiserver pod source"
Apr 21 10:03:50.988994 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.988866 2543 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 10:03:50.989876 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.989860 2543 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:50.989947 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.989880 2543 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:50.993199 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.993178 2543 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 10:03:50.995068 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.995051 2543 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 10:03:50.996906 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996894 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996911 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996917 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996922 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996927 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996933 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996939 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996944 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 10:03:50.996950 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996952 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 10:03:50.997152 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996957 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 10:03:50.997152 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996966 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 10:03:50.997152 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.996974 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 10:03:50.997749 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.997736 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 10:03:50.997824 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:50.997752 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 10:03:51.001590 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.001577 2543 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 10:03:51.001634 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.001612 2543 server.go:1295] "Started kubelet"
Apr 21 10:03:51.001746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.001705 2543 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 10:03:51.001842 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.001717 2543 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 10:03:51.001842 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.001781 2543 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 10:03:51.002583 ip-10-0-140-144 systemd[1]: Started Kubernetes Kubelet.
Apr 21 10:03:51.003282 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.003151 2543 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 10:03:51.003487 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.003332 2543 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 10:03:51.003487 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.003424 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 10:03:51.003487 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.003438 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 10:03:51.003761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.003749 2543 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 10:03:51.008829 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.007844 2543 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-144.ec2.internal.18a8571c0de6c5ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-144.ec2.internal,UID:ip-10-0-140-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-144.ec2.internal,},FirstTimestamp:2026-04-21 10:03:51.001589166 +0000 UTC m=+0.404513114,LastTimestamp:2026-04-21 10:03:51.001589166 +0000 UTC m=+0.404513114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-144.ec2.internal,}"
Apr 21 10:03:51.009570 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.009532 2543 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8r9vs"
Apr 21 10:03:51.009988 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.009969 2543 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:51.010086 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.010073 2543 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 10:03:51.010511 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.010495 2543 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 10:03:51.011097 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011080 2543 factory.go:55] Registering systemd factory
Apr 21 10:03:51.011186 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011103 2543 factory.go:223] Registration of the systemd container factory successfully
Apr 21 10:03:51.011186 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011155 2543 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 10:03:51.011186 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011167 2543 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 10:03:51.011186 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011184 2543 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 10:03:51.011375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011319 2543 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 10:03:51.011375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011329 2543 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 10:03:51.011375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011331 2543 factory.go:153] Registering CRI-O factory
Apr 21 10:03:51.011375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011361 2543 factory.go:223] Registration of the crio container factory successfully
Apr 21 10:03:51.011566 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011430 2543 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 10:03:51.011566 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011454 2543 factory.go:103] Registering Raw factory
Apr 21 10:03:51.011566 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.011336 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.011566 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011470 2543 manager.go:1196] Started watching for new ooms in manager
Apr 21 10:03:51.011967 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.011953 2543 manager.go:319] Starting recovery of all containers
Apr 21 10:03:51.018629 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.014774 2543 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 10:03:51.018629 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.014868 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 10:03:51.019003 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.018671 2543 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8r9vs"
Apr 21 10:03:51.022532 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.022513 2543 manager.go:324] Recovery completed
Apr 21 10:03:51.027130 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.027115 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.031807 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.031790 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.031859 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.031822 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.031859 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.031835 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.032233 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.032220 2543 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 10:03:51.032295 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.032234 2543 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 10:03:51.032295 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.032253 2543 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 10:03:51.034250 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.034184 2543 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-144.ec2.internal.18a8571c0fb3e233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-144.ec2.internal,UID:ip-10-0-140-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-144.ec2.internal,},FirstTimestamp:2026-04-21 10:03:51.031808563 +0000 UTC m=+0.434732512,LastTimestamp:2026-04-21 10:03:51.031808563 +0000 UTC m=+0.434732512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-144.ec2.internal,}"
Apr 21 10:03:51.035014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.035001 2543 policy_none.go:49] "None policy: Start"
Apr 21 10:03:51.035084 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.035027 2543 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 10:03:51.035084 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.035040 2543 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 10:03:51.068219 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068203 2543 manager.go:341] "Starting Device Plugin manager"
Apr 21 10:03:51.068327 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.068257 2543 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 10:03:51.068327 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068276 2543 server.go:85] "Starting device plugin registration server"
Apr 21 10:03:51.068509 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068497 2543 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 10:03:51.068705 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068513 2543 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 10:03:51.068705 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068676 2543 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 10:03:51.068890 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068752 2543 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 10:03:51.068890 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.068760 2543 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 10:03:51.069518 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.069434 2543 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 10:03:51.069518 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.069474 2543 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.145031 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.144994 2543 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 10:03:51.146356 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.146337 2543 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 10:03:51.146467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.146362 2543 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 10:03:51.146467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.146392 2543 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 10:03:51.146467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.146402 2543 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 10:03:51.146593 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.146478 2543 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 10:03:51.148801 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.148780 2543 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:51.168824 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.168765 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.169581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.169563 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.169667 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.169601 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.169667 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.169616 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.169667 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.169649 2543 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.175227 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.175212 2543 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.175285 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.175235 2543 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-144.ec2.internal\": node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.191100 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.191078 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.247014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.246992 2543 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"]
Apr 21 10:03:51.247081 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.247054 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.248479 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.248465 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.248567 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.248493 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.248567 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.248506 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.249676 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.249664 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.249843 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.249827 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.249902 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.249863 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.251937 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.251922 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.252014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.251945 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.252014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.251955 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.252014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.251972 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.252014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.251997 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.252014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.252010 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.253489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.253472 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.253576 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.253502 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:51.256013 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.255991 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:51.256089 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.256016 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:51.256089 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.256029 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:51.279210 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.279183 2543 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-144.ec2.internal\" not found" node="ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.283576 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.283561 2543 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-144.ec2.internal\" not found" node="ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.291596 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.291579 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.312514 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.312481 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.312514 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.312519 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.312648 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.312537 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc4de9b4ca4b46a01bc475a4f767a339-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"bc4de9b4ca4b46a01bc475a4f767a339\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.392521 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.392498 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.412820 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412799 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.412899 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412828 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.412899 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412844 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc4de9b4ca4b46a01bc475a4f767a339-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"bc4de9b4ca4b46a01bc475a4f767a339\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.412899 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412889 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc4de9b4ca4b46a01bc475a4f767a339-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"bc4de9b4ca4b46a01bc475a4f767a339\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.412998 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412901 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.412998 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.412923 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392595abe2998bff9f794a1adbea566a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"392595abe2998bff9f794a1adbea566a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.493239 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.493174 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.581697 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.581673 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.586596 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.586573 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"
Apr 21 10:03:51.593740 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.593722 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.694332 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.694299 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.794797 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.794766 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.855050 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.855028 2543 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:51.895065 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.895040 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:51.920494 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.920464 2543 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 10:03:51.921135 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.920638 2543 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:51.921135 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:51.920637 2543 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:51.996005 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:51.995977 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:52.010279 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.010083 2543 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:52.022039 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.022001 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:51 +0000 UTC" deadline="2027-12-26 04:59:28.437707921 +0000 UTC"
Apr 21 10:03:52.022039 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.022033 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14730h55m36.415677873s"
Apr 21 10:03:52.025587 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.025566 2543 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:52.045115 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.045089 2543 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8xmdd"
Apr 21 10:03:52.054287 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.054259 2543 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8xmdd"
Apr 21 10:03:52.087753 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:52.087716 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4de9b4ca4b46a01bc475a4f767a339.slice/crio-18c4baaa69fe4465c27a04a1c750836bd60716c0c48f3cc62cb8ae1c2eeac9f9 WatchSource:0}: Error finding container 18c4baaa69fe4465c27a04a1c750836bd60716c0c48f3cc62cb8ae1c2eeac9f9: Status 404 returned error can't find the container with id 18c4baaa69fe4465c27a04a1c750836bd60716c0c48f3cc62cb8ae1c2eeac9f9
Apr 21 10:03:52.088361 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:52.088347 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392595abe2998bff9f794a1adbea566a.slice/crio-489b563cac864af1350ca70a201e8e37be171ef1cf953750aead2fb83d881eee WatchSource:0}: Error finding container 489b563cac864af1350ca70a201e8e37be171ef1cf953750aead2fb83d881eee: Status 404 returned error can't find the container with id 489b563cac864af1350ca70a201e8e37be171ef1cf953750aead2fb83d881eee
Apr 21 10:03:52.092948 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.092935 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:03:52.096595 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:52.096578 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 21 10:03:52.149200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.149159 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" event={"ID":"bc4de9b4ca4b46a01bc475a4f767a339","Type":"ContainerStarted","Data":"18c4baaa69fe4465c27a04a1c750836bd60716c0c48f3cc62cb8ae1c2eeac9f9"}
Apr 21 10:03:52.150147 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.150126 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
event={"ID":"392595abe2998bff9f794a1adbea566a","Type":"ContainerStarted","Data":"489b563cac864af1350ca70a201e8e37be171ef1cf953750aead2fb83d881eee"} Apr 21 10:03:52.197302 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:52.197276 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 21 10:03:52.286495 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.286428 2543 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:52.297759 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:52.297741 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 21 10:03:52.398309 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:52.398279 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 21 10:03:52.499160 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:52.499127 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 21 10:03:52.524197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.524167 2543 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:52.610891 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.610814 2543 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 21 10:03:52.629333 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.629303 2543 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:52.630193 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.630179 2543 kubelet.go:3340] "Creating a mirror pod for 
static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 21 10:03:52.641415 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.641391 2543 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:52.956873 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.956803 2543 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:52.990163 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:52.990133 2543 apiserver.go:52] "Watching apiserver" Apr 21 10:03:53.002562 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.002518 2543 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 10:03:53.002866 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.002845 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h","openshift-cluster-node-tuning-operator/tuned-6xxzz","openshift-dns/node-resolver-8gn9x","openshift-image-registry/node-ca-mbz9p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal","openshift-multus/multus-additional-cni-plugins-tf4pw","openshift-multus/multus-jjplc","openshift-multus/network-metrics-daemon-7rjs4","kube-system/konnectivity-agent-w982h","kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal","openshift-network-diagnostics/network-check-target-5d6q8","openshift-network-operator/iptables-alerter-x99h9","openshift-ovn-kubernetes/ovnkube-node-hpb6h"] Apr 21 10:03:53.005036 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.005013 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.006418 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.006106 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.007745 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.007719 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8gn9x" Apr 21 10:03:53.009070 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.009051 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.010265 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.010246 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.010347 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.010288 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7n9xf\"" Apr 21 10:03:53.010347 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.010311 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 10:03:53.010510 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.010490 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.011748 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.011730 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.011886 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.011777 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.011949 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.011911 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.012007 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.011954 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.012057 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.012011 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.012105 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.012061 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qrxk4\"" Apr 21 10:03:53.013228 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.012922 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-27kw6\"" Apr 21 10:03:53.013228 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.013152 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 10:03:53.013338 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.013318 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.013407 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.013389 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.013462 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.013446 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:03:53.014062 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.014046 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.014129 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.014060 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.014129 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.014084 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 10:03:53.014682 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.014664 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.015089 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.015073 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dbqqd\"" Apr 21 10:03:53.015807 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.015788 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:53.015900 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.015854 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:03:53.017060 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.017045 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.018652 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.018633 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021263 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gqrt7\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021513 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021558 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwkdk\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021809 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5274v\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021877 2543 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.021970 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022070 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022114 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022145 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-device-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022186 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-etc-selinux\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022221 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-var-lib-kubelet\") pod \"tuned-6xxzz\" (UID: 
\"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022253 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76addc89-f8bd-48f1-b215-22850711d8a8-serviceca\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022273 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022278 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022364 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cnibin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022448 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tgn75\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022450 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-k8s-cni-cncf-io\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " 
pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022477 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022498 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-netns\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022530 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvbl6\" (UniqueName: \"kubernetes.io/projected/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-kube-api-access-wvbl6\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022584 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 10:03:53.022582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022596 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-tuned\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022628 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-tmp\") pod \"tuned-6xxzz\" (UID: 
\"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022724 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022758 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-cnibin\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022805 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-binary-copy\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022850 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022888 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvprs\" (UniqueName: \"kubernetes.io/projected/9dca93be-3209-499b-9769-928f3d717103-kube-api-access-bvprs\") pod 
\"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022931 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cni-binary-copy\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022968 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-multus\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.022999 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-kubelet\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023071 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-os-release\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023104 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-registration-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023123 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-sys-fs\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023143 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-daemon-config\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023160 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-modprobe-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023227 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-kubernetes\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.023688 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023261 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.023688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023292 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-system-cni-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023323 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023362 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-socket-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023435 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-system-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023455 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-os-release\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023474 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-socket-dir-parent\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023492 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-etc-kubernetes\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023508 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d1b8400-a581-4d5c-8d5f-39cdbe726442-tmp-dir\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023605 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27b6f\" (UniqueName: \"kubernetes.io/projected/80502d68-e1ff-4bf8-a351-5ec764c77341-kube-api-access-27b6f\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023664 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-hostroot\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023699 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-conf-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023729 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-run\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023786 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-lib-modules\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023817 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8s4\" (UniqueName: \"kubernetes.io/projected/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-kube-api-access-pm8s4\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023852 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpc8\" (UniqueName: \"kubernetes.io/projected/8d1b8400-a581-4d5c-8d5f-39cdbe726442-kube-api-access-wvpc8\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023882 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76addc89-f8bd-48f1-b215-22850711d8a8-host\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p"
Apr 21 10:03:53.024516 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.023912 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024039 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024132 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-multus-certs\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024175 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysconfig\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024206 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-host\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024234 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d1b8400-a581-4d5c-8d5f-39cdbe726442-hosts-file\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024276 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcdh\" (UniqueName: \"kubernetes.io/projected/76addc89-f8bd-48f1-b215-22850711d8a8-kube-api-access-4qcdh\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024309 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024340 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-bin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024370 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-conf\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024411 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-systemd\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.025254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.024492 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-sys\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026008 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-988zs\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026055 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026131 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026059 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026180 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026226 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 10:03:53.026317 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.026236 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 10:03:53.055064 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.055033 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:52 +0000 UTC" deadline="2027-11-01 03:27:42.875887731 +0000 UTC"
Apr 21 10:03:53.055064 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.055058 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13409h23m49.820832888s"
Apr 21 10:03:53.112415 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.112388 2543 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 10:03:53.125087 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125058 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-conf-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.125200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125101 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-run\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125138 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-systemd-units\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.125200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125165 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-ovn\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.125200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125174 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-conf-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.125200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125190 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-var-lib-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.125402 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125220 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-bin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.125402 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125248 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-run\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125402 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125272 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-bin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.125402 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125340 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-conf\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125402 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125370 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-systemd\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125432 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e382252c-0c4b-41ed-b8ca-d5f651a559a8-iptables-alerter-script\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125461 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125488 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-device-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125491 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-conf\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125532 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-systemd\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125565 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-script-lib\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125579 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-device-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125633 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cnibin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.125711 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125690 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbl6\" (UniqueName: \"kubernetes.io/projected/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-kube-api-access-wvbl6\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125728 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-tmp\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125757 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/687fb9cb-bcbd-4bee-961a-a42877702749-konnectivity-ca\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125764 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cnibin\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125785 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-etc-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125895 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-binary-copy\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125932 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cni-binary-copy\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125950 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-kubernetes\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.125974 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8s4\" (UniqueName: \"kubernetes.io/projected/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-kube-api-access-pm8s4\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126001 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-slash\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126094 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126035 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-kubernetes\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126093 2543 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126152 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-bin\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126206 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126279 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-os-release\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126304 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-registration-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126364 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-os-release\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126344 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-daemon-config\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126416 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-log-socket\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126467 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126448 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-config\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126476 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126527 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126590 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-os-release\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126616 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d1b8400-a581-4d5c-8d5f-39cdbe726442-tmp-dir\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126643 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-netns\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126648 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-cni-binary-copy\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126674 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r92bk\" (UniqueName: \"kubernetes.io/projected/324b8f2e-48a1-43e9-b457-d61a6b4fe663-kube-api-access-r92bk\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126712 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27b6f\" (UniqueName: \"kubernetes.io/projected/80502d68-e1ff-4bf8-a351-5ec764c77341-kube-api-access-27b6f\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.126744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126709 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-registration-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.126978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126772 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-os-release\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126790 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-hostroot\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.126978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126806 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpc8\" (UniqueName: \"kubernetes.io/projected/8d1b8400-a581-4d5c-8d5f-39cdbe726442-kube-api-access-wvpc8\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.126978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.126813 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h"
Apr 21 10:03:53.127086 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127001 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-daemon-config\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127086 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127000 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d1b8400-a581-4d5c-8d5f-39cdbe726442-tmp-dir\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.127170 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127079 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-hostroot\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127202 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76addc89-f8bd-48f1-b215-22850711d8a8-host\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p"
Apr 21 10:03:53.127244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127224 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-env-overrides\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.127244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127227 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76addc89-f8bd-48f1-b215-22850711d8a8-host\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p"
Apr 21 10:03:53.127244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127238 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovn-node-metrics-cert\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127256 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127290 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127361 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-multus-certs\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127393 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysconfig\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127407 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-host\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.127431 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127427 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d1b8400-a581-4d5c-8d5f-39cdbe726442-hosts-file\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127447 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-multus-certs\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127451 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcdh\" (UniqueName: \"kubernetes.io/projected/76addc89-f8bd-48f1-b215-22850711d8a8-kube-api-access-4qcdh\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127489 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-host\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127448 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-binary-copy\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127490 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127540 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127566 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-sys\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz"
Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127596 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName:
\"kubernetes.io/secret/687fb9cb-bcbd-4bee-961a-a42877702749-agent-certs\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127614 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-netd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127636 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-etc-selinux\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127602 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d1b8400-a581-4d5c-8d5f-39cdbe726442-hosts-file\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127673 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-var-lib-kubelet\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127685 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-sys\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127700 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76addc89-f8bd-48f1-b215-22850711d8a8-serviceca\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.127727 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127722 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqx6x\" (UniqueName: \"kubernetes.io/projected/e8bb2bdc-f702-42cf-a999-1816acd364ba-kube-api-access-vqx6x\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127739 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-k8s-cni-cncf-io\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127748 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-var-lib-kubelet\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127754 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-netns\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127760 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-etc-selinux\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127784 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-netns\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127787 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-tuned\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127821 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-cnibin\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127847 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127850 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127825 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-run-k8s-cni-cncf-io\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127873 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvprs\" (UniqueName: \"kubernetes.io/projected/9dca93be-3209-499b-9769-928f3d717103-kube-api-access-bvprs\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127914 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-cnibin\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.128390 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127954 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-multus\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.127985 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-kubelet\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128009 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-modprobe-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128033 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-lib-modules\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.128390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128053 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysconfig\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.129034 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128076 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-sys-fs\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128124 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128150 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76addc89-f8bd-48f1-b215-22850711d8a8-serviceca\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128155 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lz6j\" (UniqueName: \"kubernetes.io/projected/e382252c-0c4b-41ed-b8ca-d5f651a559a8-kube-api-access-6lz6j\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128183 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-systemd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128209 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128214 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-lib-modules\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128225 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-modprobe-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128236 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-node-log\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128242 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-cni-multus\") pod \"multus-jjplc\" (UID: 
\"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128209 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-sys-fs\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128255 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-host-var-lib-kubelet\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128263 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128317 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-system-cni-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128341 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-sysctl-d\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128371 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-socket-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.129034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128388 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9dca93be-3209-499b-9769-928f3d717103-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128400 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-system-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128373 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-system-cni-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128418 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-socket-dir-parent\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128459 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-multus-socket-dir-parent\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128467 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80502d68-e1ff-4bf8-a351-5ec764c77341-socket-dir\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128467 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-etc-kubernetes\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128475 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-system-cni-dir\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128504 2543 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e382252c-0c4b-41ed-b8ca-d5f651a559a8-host-slash\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128538 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-kubelet\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128569 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-etc-kubernetes\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.129503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.128632 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dca93be-3209-499b-9769-928f3d717103-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.131022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.130993 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-etc-tuned\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.131309 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.131275 2543 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-tmp\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.139720 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.139106 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbl6\" (UniqueName: \"kubernetes.io/projected/7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823-kube-api-access-wvbl6\") pod \"multus-jjplc\" (UID: \"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823\") " pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.140013 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.139987 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvprs\" (UniqueName: \"kubernetes.io/projected/9dca93be-3209-499b-9769-928f3d717103-kube-api-access-bvprs\") pod \"multus-additional-cni-plugins-tf4pw\" (UID: \"9dca93be-3209-499b-9769-928f3d717103\") " pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.141048 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.141021 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8s4\" (UniqueName: \"kubernetes.io/projected/bd31fa67-8392-4eed-b4a6-760a1b7abbf7-kube-api-access-pm8s4\") pod \"tuned-6xxzz\" (UID: \"bd31fa67-8392-4eed-b4a6-760a1b7abbf7\") " pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.141602 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.141581 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcdh\" (UniqueName: \"kubernetes.io/projected/76addc89-f8bd-48f1-b215-22850711d8a8-kube-api-access-4qcdh\") pod \"node-ca-mbz9p\" (UID: \"76addc89-f8bd-48f1-b215-22850711d8a8\") " pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.142035 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.142007 
2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpc8\" (UniqueName: \"kubernetes.io/projected/8d1b8400-a581-4d5c-8d5f-39cdbe726442-kube-api-access-wvpc8\") pod \"node-resolver-8gn9x\" (UID: \"8d1b8400-a581-4d5c-8d5f-39cdbe726442\") " pod="openshift-dns/node-resolver-8gn9x" Apr 21 10:03:53.142145 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.142122 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27b6f\" (UniqueName: \"kubernetes.io/projected/80502d68-e1ff-4bf8-a351-5ec764c77341-kube-api-access-27b6f\") pod \"aws-ebs-csi-driver-node-zkw7h\" (UID: \"80502d68-e1ff-4bf8-a351-5ec764c77341\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.229658 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229570 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-var-lib-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229658 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229618 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e382252c-0c4b-41ed-b8ca-d5f651a559a8-iptables-alerter-script\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.229658 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229623 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-var-lib-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229658 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229645 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229669 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-script-lib\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229695 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/687fb9cb-bcbd-4bee-961a-a42877702749-konnectivity-ca\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229715 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-etc-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229738 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-slash\") pod \"ovnkube-node-hpb6h\" (UID: 
\"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229781 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-slash\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229801 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-etc-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229818 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229842 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-bin\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229876 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod 
\"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229879 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-bin\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.229925 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229914 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-log-socket\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229942 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-config\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229966 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229975 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-log-socket\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.229998 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-netns\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230023 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r92bk\" (UniqueName: \"kubernetes.io/projected/324b8f2e-48a1-43e9-b457-d61a6b4fe663-kube-api-access-r92bk\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230053 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-env-overrides\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230077 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovn-node-metrics-cert\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230111 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/687fb9cb-bcbd-4bee-961a-a42877702749-agent-certs\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.230130 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230172 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-netd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.230227 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:03:53.730193235 +0000 UTC m=+3.133117203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230131 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-cni-netd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230270 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqx6x\" (UniqueName: \"kubernetes.io/projected/e8bb2bdc-f702-42cf-a999-1816acd364ba-kube-api-access-vqx6x\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230282 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/687fb9cb-bcbd-4bee-961a-a42877702749-konnectivity-ca\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230316 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lz6j\" (UniqueName: \"kubernetes.io/projected/e382252c-0c4b-41ed-b8ca-d5f651a559a8-kube-api-access-6lz6j\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.230364 ip-10-0-140-144 
kubenswrapper[2543]: I0421 10:03:53.230331 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-run-netns\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.230364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230344 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-systemd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230371 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230398 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-node-log\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230428 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230460 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e382252c-0c4b-41ed-b8ca-d5f651a559a8-host-slash\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230485 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-kubelet\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230510 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-systemd-units\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230532 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-ovn\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230574 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-env-overrides\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230316 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-script-lib\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230606 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-node-log\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230631 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-ovn\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230663 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-systemd\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230670 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpb6h\" (UID: 
\"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230704 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-run-openvswitch\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230723 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e382252c-0c4b-41ed-b8ca-d5f651a559a8-host-slash\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230757 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-host-kubelet\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.231011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230762 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/324b8f2e-48a1-43e9-b457-d61a6b4fe663-systemd-units\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.232147 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.230797 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e382252c-0c4b-41ed-b8ca-d5f651a559a8-iptables-alerter-script\") pod \"iptables-alerter-x99h9\" (UID: 
\"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.232147 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.231008 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovnkube-config\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.232746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.232725 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/324b8f2e-48a1-43e9-b457-d61a6b4fe663-ovn-node-metrics-cert\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.233058 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.233041 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/687fb9cb-bcbd-4bee-961a-a42877702749-agent-certs\") pod \"konnectivity-agent-w982h\" (UID: \"687fb9cb-bcbd-4bee-961a-a42877702749\") " pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.239049 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.239027 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:53.239143 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.239055 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:53.239143 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.239071 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod 
openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:53.239143 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.239137 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. No retries permitted until 2026-04-21 10:03:53.73912108 +0000 UTC m=+3.142045030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:53.241746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.241721 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r92bk\" (UniqueName: \"kubernetes.io/projected/324b8f2e-48a1-43e9-b457-d61a6b4fe663-kube-api-access-r92bk\") pod \"ovnkube-node-hpb6h\" (UID: \"324b8f2e-48a1-43e9-b457-d61a6b4fe663\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.241982 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.241963 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lz6j\" (UniqueName: \"kubernetes.io/projected/e382252c-0c4b-41ed-b8ca-d5f651a559a8-kube-api-access-6lz6j\") pod \"iptables-alerter-x99h9\" (UID: \"e382252c-0c4b-41ed-b8ca-d5f651a559a8\") " pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.242025 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.241985 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vqx6x\" (UniqueName: \"kubernetes.io/projected/e8bb2bdc-f702-42cf-a999-1816acd364ba-kube-api-access-vqx6x\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.315990 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.315960 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jjplc" Apr 21 10:03:53.326430 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.326404 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" Apr 21 10:03:53.335157 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.335137 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8gn9x" Apr 21 10:03:53.340705 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.340684 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbz9p" Apr 21 10:03:53.347266 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.347249 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" Apr 21 10:03:53.353787 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.353761 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" Apr 21 10:03:53.359303 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.359287 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w982h" Apr 21 10:03:53.365783 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.365765 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-x99h9" Apr 21 10:03:53.371328 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.371307 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:03:53.681488 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.681461 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687fb9cb_bcbd_4bee_961a_a42877702749.slice/crio-544101abdea1049e5327576a1c992b0501c1460f60a5d33e234ac78ab328868c WatchSource:0}: Error finding container 544101abdea1049e5327576a1c992b0501c1460f60a5d33e234ac78ab328868c: Status 404 returned error can't find the container with id 544101abdea1049e5327576a1c992b0501c1460f60a5d33e234ac78ab328868c Apr 21 10:03:53.682409 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.682272 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd31fa67_8392_4eed_b4a6_760a1b7abbf7.slice/crio-40a2db60afc66e30ee77314ea79764a1210512f341978ecb52b15e8c0b1c850f WatchSource:0}: Error finding container 40a2db60afc66e30ee77314ea79764a1210512f341978ecb52b15e8c0b1c850f: Status 404 returned error can't find the container with id 40a2db60afc66e30ee77314ea79764a1210512f341978ecb52b15e8c0b1c850f Apr 21 10:03:53.686108 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.686058 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324b8f2e_48a1_43e9_b457_d61a6b4fe663.slice/crio-54a0d51987f0cb9e512540da0b8367075716345077ba9b1552db43d63e03ef1a WatchSource:0}: Error finding container 54a0d51987f0cb9e512540da0b8367075716345077ba9b1552db43d63e03ef1a: Status 404 returned error can't find the container with id 54a0d51987f0cb9e512540da0b8367075716345077ba9b1552db43d63e03ef1a Apr 21 10:03:53.687430 
ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.687148 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1b8400_a581_4d5c_8d5f_39cdbe726442.slice/crio-0190bb3f2fdfcf258ec8d38c4beba67cfaa1ff3320ebbceed41a21ed26a8c599 WatchSource:0}: Error finding container 0190bb3f2fdfcf258ec8d38c4beba67cfaa1ff3320ebbceed41a21ed26a8c599: Status 404 returned error can't find the container with id 0190bb3f2fdfcf258ec8d38c4beba67cfaa1ff3320ebbceed41a21ed26a8c599 Apr 21 10:03:53.689799 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.689778 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9f91d5_c2d9_42cb_b675_fa8a2b0ea823.slice/crio-abd208ccd666d28fc776fea452a2f618f845c7bff8e4300cececa18cfb97dab4 WatchSource:0}: Error finding container abd208ccd666d28fc776fea452a2f618f845c7bff8e4300cececa18cfb97dab4: Status 404 returned error can't find the container with id abd208ccd666d28fc776fea452a2f618f845c7bff8e4300cececa18cfb97dab4 Apr 21 10:03:53.691556 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.691449 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76addc89_f8bd_48f1_b215_22850711d8a8.slice/crio-0567dbc177ea31e4a82d282df7590402216c6c5ffe9cb4db86cc3fb68094f068 WatchSource:0}: Error finding container 0567dbc177ea31e4a82d282df7590402216c6c5ffe9cb4db86cc3fb68094f068: Status 404 returned error can't find the container with id 0567dbc177ea31e4a82d282df7590402216c6c5ffe9cb4db86cc3fb68094f068 Apr 21 10:03:53.692276 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.692254 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dca93be_3209_499b_9769_928f3d717103.slice/crio-393755870e50763469e26e0447b0c1c767706fc7c2338603ea119eb61a6ff563 WatchSource:0}: 
Error finding container 393755870e50763469e26e0447b0c1c767706fc7c2338603ea119eb61a6ff563: Status 404 returned error can't find the container with id 393755870e50763469e26e0447b0c1c767706fc7c2338603ea119eb61a6ff563 Apr 21 10:03:53.693282 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.693243 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80502d68_e1ff_4bf8_a351_5ec764c77341.slice/crio-4ecaccd44d327f58dd49ca0c1626604c100cd797756b68b5cacbccd4d24d23d4 WatchSource:0}: Error finding container 4ecaccd44d327f58dd49ca0c1626604c100cd797756b68b5cacbccd4d24d23d4: Status 404 returned error can't find the container with id 4ecaccd44d327f58dd49ca0c1626604c100cd797756b68b5cacbccd4d24d23d4 Apr 21 10:03:53.694235 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:03:53.694205 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode382252c_0c4b_41ed_b8ca_d5f651a559a8.slice/crio-b0c8c012b43b301f24673672e11264e922b523fc01e42d93dce6228d8e137fe3 WatchSource:0}: Error finding container b0c8c012b43b301f24673672e11264e922b523fc01e42d93dce6228d8e137fe3: Status 404 returned error can't find the container with id b0c8c012b43b301f24673672e11264e922b523fc01e42d93dce6228d8e137fe3 Apr 21 10:03:53.733262 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.733239 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:53.733361 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.733349 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:53.733413 
ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.733398 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.733381658 +0000 UTC m=+4.136305594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:53.773238 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.773216 2543 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:53.833692 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:53.833667 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:53.833828 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.833791 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:53.833828 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.833805 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:53.833828 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.833813 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod 
openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:53.833928 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:53.833862 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.833849523 +0000 UTC m=+4.236773460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.055633 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.055500 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:52 +0000 UTC" deadline="2027-10-11 21:47:23.451962059 +0000 UTC" Apr 21 10:03:54.055633 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.055563 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12923h43m29.396421618s" Apr 21 10:03:54.164524 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.164459 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" event={"ID":"bd31fa67-8392-4eed-b4a6-760a1b7abbf7","Type":"ContainerStarted","Data":"40a2db60afc66e30ee77314ea79764a1210512f341978ecb52b15e8c0b1c850f"} Apr 21 10:03:54.172064 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.172007 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kube-system/konnectivity-agent-w982h" event={"ID":"687fb9cb-bcbd-4bee-961a-a42877702749","Type":"ContainerStarted","Data":"544101abdea1049e5327576a1c992b0501c1460f60a5d33e234ac78ab328868c"} Apr 21 10:03:54.181132 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.181081 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" event={"ID":"80502d68-e1ff-4bf8-a351-5ec764c77341","Type":"ContainerStarted","Data":"4ecaccd44d327f58dd49ca0c1626604c100cd797756b68b5cacbccd4d24d23d4"} Apr 21 10:03:54.182571 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.182501 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerStarted","Data":"393755870e50763469e26e0447b0c1c767706fc7c2338603ea119eb61a6ff563"} Apr 21 10:03:54.184689 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.184640 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8gn9x" event={"ID":"8d1b8400-a581-4d5c-8d5f-39cdbe726442","Type":"ContainerStarted","Data":"0190bb3f2fdfcf258ec8d38c4beba67cfaa1ff3320ebbceed41a21ed26a8c599"} Apr 21 10:03:54.192749 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.192722 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"54a0d51987f0cb9e512540da0b8367075716345077ba9b1552db43d63e03ef1a"} Apr 21 10:03:54.205648 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.204796 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" event={"ID":"bc4de9b4ca4b46a01bc475a4f767a339","Type":"ContainerStarted","Data":"ef4ee3502ac493303f2b4ad4129cbb71e0c2d9a2037b0330d1c51c83843af4cc"} Apr 21 10:03:54.208751 ip-10-0-140-144 
kubenswrapper[2543]: I0421 10:03:54.208680 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x99h9" event={"ID":"e382252c-0c4b-41ed-b8ca-d5f651a559a8","Type":"ContainerStarted","Data":"b0c8c012b43b301f24673672e11264e922b523fc01e42d93dce6228d8e137fe3"} Apr 21 10:03:54.219971 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.219900 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbz9p" event={"ID":"76addc89-f8bd-48f1-b215-22850711d8a8","Type":"ContainerStarted","Data":"0567dbc177ea31e4a82d282df7590402216c6c5ffe9cb4db86cc3fb68094f068"} Apr 21 10:03:54.228623 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.228207 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jjplc" event={"ID":"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823","Type":"ContainerStarted","Data":"abd208ccd666d28fc776fea452a2f618f845c7bff8e4300cececa18cfb97dab4"} Apr 21 10:03:54.741728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.741563 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:54.741728 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.741706 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.741953 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.741774 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:03:56.741755864 +0000 UTC m=+6.144679799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.842219 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:54.842178 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:54.842395 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.842354 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:54.842395 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.842372 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:54.842395 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.842385 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.842567 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:54.842442 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:56.842422885 +0000 UTC m=+6.245346820 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:55.147775 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:55.147747 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:55.148163 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:55.147890 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:03:55.148163 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:55.147908 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:55.148163 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:55.148008 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:03:55.252891 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:55.252762 2543 generic.go:358] "Generic (PLEG): container finished" podID="392595abe2998bff9f794a1adbea566a" containerID="63e61bcfe49bd85272513c74a0c7c1b2ec83cef83127fdda166c1416e5f9c560" exitCode=0 Apr 21 10:03:55.253300 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:55.253273 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" event={"ID":"392595abe2998bff9f794a1adbea566a","Type":"ContainerDied","Data":"63e61bcfe49bd85272513c74a0c7c1b2ec83cef83127fdda166c1416e5f9c560"} Apr 21 10:03:55.266859 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:55.266804 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" podStartSLOduration=3.266786185 podStartE2EDuration="3.266786185s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:54.230206376 +0000 UTC m=+3.633130333" watchObservedRunningTime="2026-04-21 10:03:55.266786185 +0000 UTC m=+4.669710147" Apr 21 10:03:56.261046 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.260797 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" event={"ID":"392595abe2998bff9f794a1adbea566a","Type":"ContainerStarted","Data":"2cde5581a0812030b8f8e8c27e6edf65e60952440e5fe38cc58d592834c3b7e4"} Apr 21 10:03:56.391745 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.391686 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" 
podStartSLOduration=4.391663954 podStartE2EDuration="4.391663954s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:56.288654831 +0000 UTC m=+5.691578788" watchObservedRunningTime="2026-04-21 10:03:56.391663954 +0000 UTC m=+5.794587913" Apr 21 10:03:56.392131 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.392113 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lnpnt"] Apr 21 10:03:56.395095 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.395033 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.395216 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.395130 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:03:56.454581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.454494 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-dbus\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.454581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.454538 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.454781 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.454593 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-kubelet-config\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.555949 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.555912 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-dbus\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.556112 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.555962 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.556112 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.556001 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-kubelet-config\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.556112 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.556105 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-kubelet-config\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.556277 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.556254 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96143cfb-12f0-4d9d-bf83-e55429b937c9-dbus\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:56.556385 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.556364 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:56.556442 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.556436 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:57.056416446 +0000 UTC m=+6.459340382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:56.760320 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.759818 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:56.760320 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.759936 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:56.760320 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.759985 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:04:00.759971004 +0000 UTC m=+10.162894940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:56.861123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:56.861086 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:56.861355 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.861314 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:56.861355 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.861339 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:56.861355 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.861353 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:56.861530 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:56.861425 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:00.861404699 +0000 UTC m=+10.264328648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:57.062809 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:57.062731 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:57.062997 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:57.062974 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:57.063073 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:57.063054 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:58.063032936 +0000 UTC m=+7.465956879 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:57.147300 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:57.146889 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:57.147300 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:57.147006 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:03:57.147300 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:57.146889 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:57.147300 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:57.147224 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:03:58.071446 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:58.071405 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:58.072029 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:58.071572 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:58.072029 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:58.071644 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:00.07162389 +0000 UTC m=+9.474547836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:58.147381 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:58.147346 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:03:58.147530 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:58.147478 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:03:59.147120 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:59.147088 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:03:59.147602 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:59.147218 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:03:59.147602 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:03:59.147567 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:03:59.147749 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:03:59.147691 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:00.087936 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:00.087821 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:00.088133 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.088012 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:00.088133 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.088095 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:04.088074443 +0000 UTC m=+13.490998385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:00.147175 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:00.147144 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:00.147699 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.147261 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:00.794811 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:00.794768 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:00.794983 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.794949 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:00.795039 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.795000 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:04:08.794986719 +0000 UTC m=+18.197910654 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:00.895363 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:00.895228 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:00.895566 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.895419 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:00.895566 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.895443 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:00.895566 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.895456 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:00.895566 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:00.895507 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:08.895494454 +0000 UTC m=+18.298418389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:01.148629 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:01.148535 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:01.148964 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:01.148681 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:01.149270 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:01.149117 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:01.149270 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:01.149226 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:02.146726 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:02.146693 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:02.146966 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:02.146816 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:03.147653 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:03.147617 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:03.148066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:03.147626 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:03.148066 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:03.147724 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:03.148066 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:03.147785 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:04.119297 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:04.119253 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:04.119473 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:04.119394 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:04.119519 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:04.119480 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:12.119458086 +0000 UTC m=+21.522382033 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:04.146879 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:04.146847 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:04.147062 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:04.146986 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:05.147478 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:05.147436 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:05.147949 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:05.147437 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:05.147949 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:05.147587 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:05.147949 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:05.147648 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:06.146896 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:06.146861 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:06.147051 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:06.147023 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:07.146935 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:07.146852 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:07.147464 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:07.146982 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:07.147464 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:07.147050 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:07.147464 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:07.147137 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:08.146845 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:08.146813 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:08.147050 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.146922 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:08.854506 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:08.854477 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:08.854715 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.854642 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:08.854715 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.854704 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.85468513 +0000 UTC m=+34.257609071 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:08.955824 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:08.955785 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:08.956003 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.955969 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:08.956003 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.955989 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:08.956003 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.956003 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:08.956160 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:08.956057 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:24.956043149 +0000 UTC m=+34.358967084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:09.146753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:09.146666 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:09.146753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:09.146685 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:09.147026 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:09.146815 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:09.147026 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:09.146898 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:10.147142 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:10.147101 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:10.147622 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:10.147240 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:11.147420 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:11.147385 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:11.148054 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:11.147470 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:11.148054 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:11.147570 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:11.148054 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:11.147666 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:12.146865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.146702 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:12.147002 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:12.146937 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:12.181811 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.181752 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:12.182297 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:12.181846 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:12.182297 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:12.181890 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret podName:96143cfb-12f0-4d9d-bf83-e55429b937c9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:28.18187635 +0000 UTC m=+37.584800284 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret") pod "global-pull-secret-syncer-lnpnt" (UID: "96143cfb-12f0-4d9d-bf83-e55429b937c9") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:12.288564 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.288511 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbz9p" event={"ID":"76addc89-f8bd-48f1-b215-22850711d8a8","Type":"ContainerStarted","Data":"484f35cebf7cfa69699d1fd557215ae6e0e60c64117384519b01792ed2ffb7bf"} Apr 21 10:04:12.290373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.290336 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jjplc" event={"ID":"7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823","Type":"ContainerStarted","Data":"8361b96dfeac3ee623fed51d130228d64bb8bdcf1e46275cb38aadd3979056f3"} Apr 21 10:04:12.291918 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.291893 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" event={"ID":"bd31fa67-8392-4eed-b4a6-760a1b7abbf7","Type":"ContainerStarted","Data":"9a88fc4b8ceebd3273b49e971f0dc756f4af60be51329ed0fc461cc6bd29691b"} Apr 21 10:04:12.293533 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.293514 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w982h" event={"ID":"687fb9cb-bcbd-4bee-961a-a42877702749","Type":"ContainerStarted","Data":"89f348bff7c6db023490744c9e61bca0747327cebe2ceec072bf9beb8fcf5a12"} Apr 21 10:04:12.295025 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.294997 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" event={"ID":"80502d68-e1ff-4bf8-a351-5ec764c77341","Type":"ContainerStarted","Data":"a2754b78ab544c381d4ced3311abb7e7a49d94ac66fb08878cd580796ec0a981"} 
Apr 21 10:04:12.296375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.296350 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="c550b1e60ac21d44c85edfb09f40dd1795cb4ecc8ef690d763dfd81c2363f9dc" exitCode=0 Apr 21 10:04:12.296468 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.296389 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"c550b1e60ac21d44c85edfb09f40dd1795cb4ecc8ef690d763dfd81c2363f9dc"} Apr 21 10:04:12.298129 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.298106 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8gn9x" event={"ID":"8d1b8400-a581-4d5c-8d5f-39cdbe726442","Type":"ContainerStarted","Data":"e32981147dca2c79fa2e78e2874e818d534fb71469be4020969df612c9c66246"} Apr 21 10:04:12.300814 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300793 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"70549043c3a1e53adec54629015781a10f759038e5cb43a809598ae5d582bb64"} Apr 21 10:04:12.300915 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300818 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"5819dd465b04cde817d097b4be3f57509765629b9717134864907a462fae4f25"} Apr 21 10:04:12.300915 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300827 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"9e7221e5f0b2b3f4e89cc70223f06ccaa4190ebccc1bb399d3e7ba04dd74739e"} Apr 21 10:04:12.300915 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300836 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"7a54c76b126c5fcf7e3de2e7406c33d06a96846878f74b987611d4010afe543d"} Apr 21 10:04:12.300915 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300843 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"0c2f6e7818407ad4c2e4ac21392867cb37ec0bb8dd2472ee0cb0a25f55583055"} Apr 21 10:04:12.300915 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.300851 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"34aff639bca49d56bb67926e4e53a221e092bab8b924292963912eb3a18b5728"} Apr 21 10:04:12.303015 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.302979 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mbz9p" podStartSLOduration=3.660690045 podStartE2EDuration="21.302967067s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.693215941 +0000 UTC m=+3.096139889" lastFinishedPulling="2026-04-21 10:04:11.335492963 +0000 UTC m=+20.738416911" observedRunningTime="2026-04-21 10:04:12.302624202 +0000 UTC m=+21.705548171" watchObservedRunningTime="2026-04-21 10:04:12.302967067 +0000 UTC m=+21.705891026" Apr 21 10:04:12.318239 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.318195 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6xxzz" podStartSLOduration=3.666817396 podStartE2EDuration="21.318181748s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 
10:03:53.684534385 +0000 UTC m=+3.087458322" lastFinishedPulling="2026-04-21 10:04:11.335898733 +0000 UTC m=+20.738822674" observedRunningTime="2026-04-21 10:04:12.317952805 +0000 UTC m=+21.720876761" watchObservedRunningTime="2026-04-21 10:04:12.318181748 +0000 UTC m=+21.721105706" Apr 21 10:04:12.352826 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.352782 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jjplc" podStartSLOduration=3.675650605 podStartE2EDuration="21.352769597s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.691253762 +0000 UTC m=+3.094177696" lastFinishedPulling="2026-04-21 10:04:11.36837275 +0000 UTC m=+20.771296688" observedRunningTime="2026-04-21 10:04:12.352664022 +0000 UTC m=+21.755587980" watchObservedRunningTime="2026-04-21 10:04:12.352769597 +0000 UTC m=+21.755693554" Apr 21 10:04:12.365742 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.365708 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8gn9x" podStartSLOduration=3.719438213 podStartE2EDuration="21.365696446s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.689460056 +0000 UTC m=+3.092383990" lastFinishedPulling="2026-04-21 10:04:11.335718282 +0000 UTC m=+20.738642223" observedRunningTime="2026-04-21 10:04:12.365690269 +0000 UTC m=+21.768614226" watchObservedRunningTime="2026-04-21 10:04:12.365696446 +0000 UTC m=+21.768620403" Apr 21 10:04:12.378529 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.378497 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w982h" podStartSLOduration=3.726794055 podStartE2EDuration="21.378487346s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.683807931 +0000 UTC m=+3.086731878" lastFinishedPulling="2026-04-21 10:04:11.335501219 +0000 UTC 
m=+20.738425169" observedRunningTime="2026-04-21 10:04:12.378290789 +0000 UTC m=+21.781214749" watchObservedRunningTime="2026-04-21 10:04:12.378487346 +0000 UTC m=+21.781411303" Apr 21 10:04:12.566278 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:12.566247 2543 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 10:04:13.080962 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.080842 2543 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:12.566273803Z","UUID":"9786441b-a5f9-4ac3-b06a-5f84f01fd268","Handler":null,"Name":"","Endpoint":""} Apr 21 10:04:13.083404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.083376 2543 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 10:04:13.083404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.083409 2543 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 10:04:13.146804 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.146769 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:13.146965 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:13.146895 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:13.146965 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.146931 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:13.147156 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:13.147036 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:13.304248 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.304208 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" event={"ID":"80502d68-e1ff-4bf8-a351-5ec764c77341","Type":"ContainerStarted","Data":"8c356328b657475eb346f2de7e2c77c46a9bb3e659f748e3cc10c03ad98e8996"} Apr 21 10:04:13.305737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.305693 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x99h9" event={"ID":"e382252c-0c4b-41ed-b8ca-d5f651a559a8","Type":"ContainerStarted","Data":"e656062fd4b434f6a7582b1ea969126c1035c4344a632546f98d2eea38e8ffc7"} Apr 21 10:04:13.320326 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:13.320281 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-x99h9" podStartSLOduration=4.657954959 podStartE2EDuration="22.320270291s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.695964897 +0000 UTC m=+3.098888833" lastFinishedPulling="2026-04-21 10:04:11.358280225 +0000 UTC m=+20.761204165" 
observedRunningTime="2026-04-21 10:04:13.320034401 +0000 UTC m=+22.722958358" watchObservedRunningTime="2026-04-21 10:04:13.320270291 +0000 UTC m=+22.723194247" Apr 21 10:04:14.147428 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:14.147400 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:14.147608 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:14.147570 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9" Apr 21 10:04:14.311693 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:14.311663 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"9d710b7e67c0a1ae32939def2d08c79a503261c6c2b39ec2f72aca6d7732695d"} Apr 21 10:04:14.313967 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:14.313942 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" event={"ID":"80502d68-e1ff-4bf8-a351-5ec764c77341","Type":"ContainerStarted","Data":"a526699ba848657017d5e8f22f90f4aacf26dbfa1181bf26d372437447ec5a7a"} Apr 21 10:04:14.334488 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:14.334438 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zkw7h" podStartSLOduration=2.890934238 podStartE2EDuration="23.334420089s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.695614706 +0000 UTC m=+3.098538642" lastFinishedPulling="2026-04-21 
10:04:14.139100545 +0000 UTC m=+23.542024493" observedRunningTime="2026-04-21 10:04:14.333606019 +0000 UTC m=+23.736529975" watchObservedRunningTime="2026-04-21 10:04:14.334420089 +0000 UTC m=+23.737344047" Apr 21 10:04:15.146787 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:15.146754 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:15.146969 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:15.146879 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:04:15.146969 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:15.146949 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:15.147080 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:15.147038 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec" Apr 21 10:04:16.147570 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:16.147382 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:16.147992 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:16.147664 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:16.750255 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:16.750225 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w982h"
Apr 21 10:04:16.750829 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:16.750810 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w982h"
Apr 21 10:04:17.147532 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.147502 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:17.147751 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.147502 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:17.147751 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:17.147642 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec"
Apr 21 10:04:17.147751 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:17.147739 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba"
Apr 21 10:04:17.322687 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.322653 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="9fbdb337590043dd2b3897ed76f515a33410b8d48969bd9707babddb83e50047" exitCode=0
Apr 21 10:04:17.322898 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.322736 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"9fbdb337590043dd2b3897ed76f515a33410b8d48969bd9707babddb83e50047"}
Apr 21 10:04:17.326005 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.325977 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" event={"ID":"324b8f2e-48a1-43e9-b457-d61a6b4fe663","Type":"ContainerStarted","Data":"68a304bb3859f74c1d30b66f28eca384ab952406f39c3c33b43a39854e99b727"}
Apr 21 10:04:17.326376 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.326353 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:04:17.326459 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.326380 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:04:17.326459 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.326393 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:04:17.341681 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.341517 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:04:17.341794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.341734 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h"
Apr 21 10:04:17.370845 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:17.370800 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" podStartSLOduration=8.62210549 podStartE2EDuration="26.370789358s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.688610588 +0000 UTC m=+3.091534523" lastFinishedPulling="2026-04-21 10:04:11.437294439 +0000 UTC m=+20.840218391" observedRunningTime="2026-04-21 10:04:17.369199867 +0000 UTC m=+26.772123825" watchObservedRunningTime="2026-04-21 10:04:17.370789358 +0000 UTC m=+26.773713314"
Apr 21 10:04:18.146714 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.146688 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:18.146847 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:18.146789 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:18.329364 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.329335 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="c515e0fbf4b5c2c593db1c0488bfd22e15e2b09b4e1a720af54da30d39be2503" exitCode=0
Apr 21 10:04:18.329861 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.329426 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"c515e0fbf4b5c2c593db1c0488bfd22e15e2b09b4e1a720af54da30d39be2503"}
Apr 21 10:04:18.536410 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.536326 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5d6q8"]
Apr 21 10:04:18.536587 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.536446 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:18.536587 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:18.536533 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec"
Apr 21 10:04:18.541244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.541165 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rjs4"]
Apr 21 10:04:18.541366 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.541299 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:18.541426 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:18.541394 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba"
Apr 21 10:04:18.541881 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.541860 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lnpnt"]
Apr 21 10:04:18.541968 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:18.541951 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:18.542027 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:18.542013 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:19.333472 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:19.333383 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="c0ab49fe6949212ebcfaad86cc9b0c7ad9e5625b8e4cfed8ac485bb2217116b5" exitCode=0
Apr 21 10:04:19.334170 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:19.333465 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"c0ab49fe6949212ebcfaad86cc9b0c7ad9e5625b8e4cfed8ac485bb2217116b5"}
Apr 21 10:04:20.146619 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:20.146590 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:20.146619 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:20.146604 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:20.146619 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:20.146590 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:20.146896 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:20.146729 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec"
Apr 21 10:04:20.146896 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:20.146777 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba"
Apr 21 10:04:20.146896 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:20.146853 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:22.147541 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:22.147500 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:22.147945 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:22.147573 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:22.147945 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:22.147516 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:22.147945 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:22.147651 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec"
Apr 21 10:04:22.147945 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:22.147713 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba"
Apr 21 10:04:22.147945 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:22.147792 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:23.038457 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:23.038420 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w982h"
Apr 21 10:04:23.038641 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:23.038597 2543 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 10:04:23.039247 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:23.039227 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w982h"
Apr 21 10:04:24.146717 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.146682 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:24.147104 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.146790 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d6q8" podUID="c91615b3-acf6-4090-ad49-f307699df3ec"
Apr 21 10:04:24.147104 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.146696 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:24.147104 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.146682 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt"
Apr 21 10:04:24.147104 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.146893 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba"
Apr 21 10:04:24.147104 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.147012 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lnpnt" podUID="96143cfb-12f0-4d9d-bf83-e55429b937c9"
Apr 21 10:04:24.412481 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.412452 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeReady"
Apr 21 10:04:24.412648 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.412605 2543 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 10:04:24.467737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.467705 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bvmct"]
Apr 21 10:04:24.503586 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.503501 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zbcx6"]
Apr 21 10:04:24.505223 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.504014 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.507659 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.507245 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 10:04:24.508620 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.508594 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\""
Apr 21 10:04:24.508726 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.508644 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 10:04:24.523104 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.523076 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bvmct"]
Apr 21 10:04:24.523104 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.523107 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbcx6"]
Apr 21 10:04:24.523272 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.523230 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.526842 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.526824 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 10:04:24.527134 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.527118 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 10:04:24.527390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.527372 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\""
Apr 21 10:04:24.527476 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.527376 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 10:04:24.673231 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673145 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.673231 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673217 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7z4\" (UniqueName: \"kubernetes.io/projected/8165d059-3605-4311-b7f3-ad6cd4ab874b-kube-api-access-6p7z4\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.673443 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673237 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pvt\" (UniqueName: \"kubernetes.io/projected/8c2132b5-17f3-4ef3-89d6-07554e363088-kube-api-access-s2pvt\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.673443 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673295 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.673443 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673332 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8165d059-3605-4311-b7f3-ad6cd4ab874b-tmp-dir\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.673443 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.673371 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8165d059-3605-4311-b7f3-ad6cd4ab874b-config-volume\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.774505 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774466 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8165d059-3605-4311-b7f3-ad6cd4ab874b-tmp-dir\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.774737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774527 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8165d059-3605-4311-b7f3-ad6cd4ab874b-config-volume\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.774737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774600 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.774737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774633 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7z4\" (UniqueName: \"kubernetes.io/projected/8165d059-3605-4311-b7f3-ad6cd4ab874b-kube-api-access-6p7z4\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.774737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774649 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pvt\" (UniqueName: \"kubernetes.io/projected/8c2132b5-17f3-4ef3-89d6-07554e363088-kube-api-access-s2pvt\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.774737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774675 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.774907 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.774768 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:24.774907 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.774774 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:24.774907 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.774815 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.274802006 +0000 UTC m=+34.677725941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found
Apr 21 10:04:24.774907 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.774847 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.274828354 +0000 UTC m=+34.677752293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:24.774907 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.774848 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8165d059-3605-4311-b7f3-ad6cd4ab874b-tmp-dir\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.775183 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.775153 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8165d059-3605-4311-b7f3-ad6cd4ab874b-config-volume\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.787805 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.787783 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pvt\" (UniqueName: \"kubernetes.io/projected/8c2132b5-17f3-4ef3-89d6-07554e363088-kube-api-access-s2pvt\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:24.788233 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.788215 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7z4\" (UniqueName: \"kubernetes.io/projected/8165d059-3605-4311-b7f3-ad6cd4ab874b-kube-api-access-6p7z4\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:24.875987 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.875950 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:04:24.876162 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.876125 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:24.876235 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.876207 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.876187577 +0000 UTC m=+66.279111528 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:24.976992 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:24.976906 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8"
Apr 21 10:04:24.977113 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.977075 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:04:24.977113 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.977096 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:04:24.977113 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.977108 2543 projected.go:194] Error preparing data for projected volume kube-api-access-g97q2 for pod openshift-network-diagnostics/network-check-target-5d6q8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:24.977209 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:24.977169 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2 podName:c91615b3-acf6-4090-ad49-f307699df3ec nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.977155247 +0000 UTC m=+66.380079182 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97q2" (UniqueName: "kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2") pod "network-check-target-5d6q8" (UID: "c91615b3-acf6-4090-ad49-f307699df3ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:25.279197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.279119 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct"
Apr 21 10:04:25.279197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.279174 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:04:25.279694 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:25.279267 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:25.279694 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:25.279305 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:25.279694 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:25.279318 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:26.279304768 +0000 UTC m=+35.682228704 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found
Apr 21 10:04:25.279694 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:25.279378 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:26.279362588 +0000 UTC m=+35.682286524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:25.439849 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.439825 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"]
Apr 21 10:04:25.467951 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.467922 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"]
Apr 21 10:04:25.468062 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.468029 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.470499 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470479 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 21 10:04:25.470499 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470495 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 21 10:04:25.470677 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470492 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 21 10:04:25.470733 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470693 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 10:04:25.470866 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470837 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 10:04:25.470928 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.470899 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 21 10:04:25.471454 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.471428 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 10:04:25.581663 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581636 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.581779 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581675 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.581779 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581708 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.581898 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581813 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/186db918-dd5c-4e62-9acf-d0c7089b3e52-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.581898 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581841 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.581898 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.581868 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9s2\" (UniqueName: \"kubernetes.io/projected/186db918-dd5c-4e62-9acf-d0c7089b3e52-kube-api-access-vv9s2\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683029 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.682992 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/186db918-dd5c-4e62-9acf-d0c7089b3e52-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683029 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683032 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683251 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683051 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9s2\" (UniqueName: \"kubernetes.io/projected/186db918-dd5c-4e62-9acf-d0c7089b3e52-kube-api-access-vv9s2\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683251 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683119 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683251 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683148 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:04:25.683251 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683181 2543
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.683774 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.683746 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/186db918-dd5c-4e62-9acf-d0c7089b3e52-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.686140 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.686112 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-ca\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.686365 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.686344 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.686407 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.686345 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.686441 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.686411 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/186db918-dd5c-4e62-9acf-d0c7089b3e52-hub\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.693688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.693666 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9s2\" (UniqueName: \"kubernetes.io/projected/186db918-dd5c-4e62-9acf-d0c7089b3e52-kube-api-access-vv9s2\") pod \"cluster-proxy-proxy-agent-b88d4974b-9mf66\" (UID: \"186db918-dd5c-4e62-9acf-d0c7089b3e52\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.794050 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.794017 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" Apr 21 10:04:25.977754 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:25.977588 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"] Apr 21 10:04:25.982051 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:04:25.982021 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod186db918_dd5c_4e62_9acf_d0c7089b3e52.slice/crio-39c94c7644b0772098631b7820f0abb29e870caa0851aedca6e4f63bf8c52a5a WatchSource:0}: Error finding container 39c94c7644b0772098631b7820f0abb29e870caa0851aedca6e4f63bf8c52a5a: Status 404 returned error can't find the container with id 39c94c7644b0772098631b7820f0abb29e870caa0851aedca6e4f63bf8c52a5a Apr 21 10:04:26.147156 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.147081 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:26.147156 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.147124 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:26.147156 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.147150 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:26.150148 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.150094 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:26.150148 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.150115 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\"" Apr 21 10:04:26.150148 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.150131 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 10:04:26.150385 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.150114 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:26.150385 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.150193 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:26.151366 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.151337 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\"" Apr 21 10:04:26.287176 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.287134 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:04:26.287596 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.287224 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:04:26.287596 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:26.287283 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:26.287596 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:26.287326 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:26.287596 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:26.287357 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:28.287340629 +0000 UTC m=+37.690264585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:04:26.287596 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:26.287386 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:28.287367976 +0000 UTC m=+37.690291911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:04:26.349748 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.349713 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="56b8635296276f9df3791977af3266acd2e1bf337411c861201d6d4508c790f1" exitCode=0 Apr 21 10:04:26.349913 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.349794 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"56b8635296276f9df3791977af3266acd2e1bf337411c861201d6d4508c790f1"} Apr 21 10:04:26.350827 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:26.350613 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerStarted","Data":"39c94c7644b0772098631b7820f0abb29e870caa0851aedca6e4f63bf8c52a5a"} Apr 21 10:04:27.358112 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:27.358069 2543 generic.go:358] "Generic (PLEG): container finished" podID="9dca93be-3209-499b-9769-928f3d717103" containerID="e8297f7863e41448cb0a06ccb183572979b1b7907b50893e249ebe199c1b2f18" exitCode=0 Apr 21 10:04:27.358748 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:27.358134 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerDied","Data":"e8297f7863e41448cb0a06ccb183572979b1b7907b50893e249ebe199c1b2f18"} Apr 21 10:04:28.203631 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.203527 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:28.207266 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.207232 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96143cfb-12f0-4d9d-bf83-e55429b937c9-original-pull-secret\") pod \"global-pull-secret-syncer-lnpnt\" (UID: \"96143cfb-12f0-4d9d-bf83-e55429b937c9\") " pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:28.266616 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.266582 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lnpnt" Apr 21 10:04:28.303889 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.303856 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:04:28.304056 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.303932 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:04:28.304056 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:28.304029 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:28.304181 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:28.304106 2543 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.304085492 +0000 UTC m=+41.707009445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:04:28.304181 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:28.304033 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:28.304181 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:28.304171 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.304154438 +0000 UTC m=+41.707078374 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:04:28.881526 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:28.881274 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lnpnt"] Apr 21 10:04:28.980147 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:04:28.980115 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96143cfb_12f0_4d9d_bf83_e55429b937c9.slice/crio-b5dbedcc8d2cb2d92b1b68a787910520e41dffe126de7294e35487857be57471 WatchSource:0}: Error finding container b5dbedcc8d2cb2d92b1b68a787910520e41dffe126de7294e35487857be57471: Status 404 returned error can't find the container with id b5dbedcc8d2cb2d92b1b68a787910520e41dffe126de7294e35487857be57471 Apr 21 10:04:29.365026 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:29.364993 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" event={"ID":"9dca93be-3209-499b-9769-928f3d717103","Type":"ContainerStarted","Data":"7a8c0be57ee51c754db7fedb52e36ce2f8d4c031bb01a593255b33563ddf799a"} Apr 21 10:04:29.366354 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:29.366324 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerStarted","Data":"1f298256f961bddf45b45e8d3787e73325fcad16e5afd9300228d36d913f59fb"} Apr 21 10:04:29.367272 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:29.367255 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lnpnt" 
event={"ID":"96143cfb-12f0-4d9d-bf83-e55429b937c9","Type":"ContainerStarted","Data":"b5dbedcc8d2cb2d92b1b68a787910520e41dffe126de7294e35487857be57471"} Apr 21 10:04:29.390381 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:29.390345 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tf4pw" podStartSLOduration=6.766986117 podStartE2EDuration="38.390334038s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:03:53.695739528 +0000 UTC m=+3.098663469" lastFinishedPulling="2026-04-21 10:04:25.319087454 +0000 UTC m=+34.722011390" observedRunningTime="2026-04-21 10:04:29.38913915 +0000 UTC m=+38.792063107" watchObservedRunningTime="2026-04-21 10:04:29.390334038 +0000 UTC m=+38.793257988" Apr 21 10:04:32.337843 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:32.337805 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:04:32.338286 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:32.337891 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:04:32.338286 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:32.337989 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:32.338286 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:32.338014 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 
10:04:32.338286 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:32.338066 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:40.338045723 +0000 UTC m=+49.740969662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:04:32.338286 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:32.338086 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:40.338076314 +0000 UTC m=+49.741000255 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:04:33.379318 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:33.379230 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerStarted","Data":"adde29f12b6ce02295faddae802d754f11225cd38f960737885f09fbc12aa2ab"} Apr 21 10:04:33.379318 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:33.379273 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerStarted","Data":"eb26632904759856e9e20b400e4520c79fdcb71c11216351251dbf729470d526"} Apr 21 10:04:33.404611 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:33.404568 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" podStartSLOduration=1.296754134 podStartE2EDuration="8.404535158s" podCreationTimestamp="2026-04-21 10:04:25 +0000 UTC" firstStartedPulling="2026-04-21 10:04:25.983599782 +0000 UTC m=+35.386523716" lastFinishedPulling="2026-04-21 10:04:33.091380805 +0000 UTC m=+42.494304740" observedRunningTime="2026-04-21 10:04:33.403101412 +0000 UTC m=+42.806025370" watchObservedRunningTime="2026-04-21 10:04:33.404535158 +0000 UTC m=+42.807459309" Apr 21 10:04:34.382169 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:34.382132 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lnpnt" 
event={"ID":"96143cfb-12f0-4d9d-bf83-e55429b937c9","Type":"ContainerStarted","Data":"ede61b731206c4604e3f0f7b70a6b74163a59e204f0050d41395b6841884eab4"} Apr 21 10:04:34.400393 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:34.400352 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lnpnt" podStartSLOduration=33.996113395 podStartE2EDuration="38.400337737s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:04:28.996871439 +0000 UTC m=+38.399795377" lastFinishedPulling="2026-04-21 10:04:33.401095767 +0000 UTC m=+42.804019719" observedRunningTime="2026-04-21 10:04:34.400314731 +0000 UTC m=+43.803238692" watchObservedRunningTime="2026-04-21 10:04:34.400337737 +0000 UTC m=+43.803261687" Apr 21 10:04:40.401664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:40.401630 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:04:40.402020 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:40.401678 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:04:40.402020 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:40.401764 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:40.402020 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:40.401773 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:40.402020 
ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:40.401812 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.401799934 +0000 UTC m=+65.804723869 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:04:40.402020 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:40.401824 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.401818791 +0000 UTC m=+65.804742725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:04:49.344093 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:49.344066 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpb6h" Apr 21 10:04:56.415481 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:56.415423 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:04:56.416053 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:56.415527 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:04:56.416053 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.415623 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:56.416053 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.415666 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:56.416053 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.415722 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:05:28.415703795 +0000 UTC m=+97.818627750 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:04:56.416053 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.415735 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:28.415728952 +0000 UTC m=+97.818652887 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:04:56.919181 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:56.919144 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:04:56.922685 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:56.922667 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:56.929503 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.929490 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:56.929612 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:04:56.929601 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:06:00.929584447 +0000 UTC m=+130.332508385 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : secret "metrics-daemon-secret" not found Apr 21 10:04:57.020452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.020419 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:57.024503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.024464 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:57.033631 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.033608 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:57.044258 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.044232 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g97q2\" (UniqueName: \"kubernetes.io/projected/c91615b3-acf6-4090-ad49-f307699df3ec-kube-api-access-g97q2\") pod \"network-check-target-5d6q8\" (UID: \"c91615b3-acf6-4090-ad49-f307699df3ec\") " pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:57.078583 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.078558 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\"" Apr 21 10:04:57.082030 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.082016 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:04:57.210216 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.210130 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5d6q8"] Apr 21 10:04:57.214645 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:04:57.214610 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91615b3_acf6_4090_ad49_f307699df3ec.slice/crio-5d6a4baff488b4e3eb0a23478018472f2c3dd29967985d0c6b4d05fcd1655319 WatchSource:0}: Error finding container 5d6a4baff488b4e3eb0a23478018472f2c3dd29967985d0c6b4d05fcd1655319: Status 404 returned error can't find the container with id 5d6a4baff488b4e3eb0a23478018472f2c3dd29967985d0c6b4d05fcd1655319 Apr 21 10:04:57.425734 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:04:57.425701 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5d6q8" event={"ID":"c91615b3-acf6-4090-ad49-f307699df3ec","Type":"ContainerStarted","Data":"5d6a4baff488b4e3eb0a23478018472f2c3dd29967985d0c6b4d05fcd1655319"} Apr 21 10:05:00.432483 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:00.432453 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5d6q8" event={"ID":"c91615b3-acf6-4090-ad49-f307699df3ec","Type":"ContainerStarted","Data":"5192eebf924022f529d5d2ca88285f1fbf7f8b5c34e33577c2ed975c06fed433"} Apr 21 10:05:00.432959 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:00.432582 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:05:00.451257 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:00.451203 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5d6q8" 
podStartSLOduration=66.840544414 podStartE2EDuration="1m9.451188766s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:04:57.216499591 +0000 UTC m=+66.619423526" lastFinishedPulling="2026-04-21 10:04:59.827143931 +0000 UTC m=+69.230067878" observedRunningTime="2026-04-21 10:05:00.450975552 +0000 UTC m=+69.853899513" watchObservedRunningTime="2026-04-21 10:05:00.451188766 +0000 UTC m=+69.854112763" Apr 21 10:05:28.459774 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:28.459729 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:05:28.460200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:28.459795 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:05:28.460200 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:05:28.459888 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:28.460200 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:05:28.459892 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:05:28.460200 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:05:28.459942 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert podName:8c2132b5-17f3-4ef3-89d6-07554e363088 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:32.459927098 +0000 UTC m=+161.862851033 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert") pod "ingress-canary-zbcx6" (UID: "8c2132b5-17f3-4ef3-89d6-07554e363088") : secret "canary-serving-cert" not found Apr 21 10:05:28.460200 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:05:28.459962 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls podName:8165d059-3605-4311-b7f3-ad6cd4ab874b nodeName:}" failed. No retries permitted until 2026-04-21 10:06:32.459947843 +0000 UTC m=+161.862871778 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls") pod "dns-default-bvmct" (UID: "8165d059-3605-4311-b7f3-ad6cd4ab874b") : secret "dns-default-metrics-tls" not found Apr 21 10:05:31.436837 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:05:31.436801 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5d6q8" Apr 21 10:06:00.976279 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:00.976219 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4" Apr 21 10:06:00.976796 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:06:00.976343 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:06:00.976796 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:06:00.976398 2543 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs podName:e8bb2bdc-f702-42cf-a999-1816acd364ba nodeName:}" failed. No retries permitted until 2026-04-21 10:08:02.976383222 +0000 UTC m=+252.379307157 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs") pod "network-metrics-daemon-7rjs4" (UID: "e8bb2bdc-f702-42cf-a999-1816acd364ba") : secret "metrics-daemon-secret" not found Apr 21 10:06:13.165996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.165965 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw"] Apr 21 10:06:13.180378 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.180354 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw"] Apr 21 10:06:13.180489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.180464 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.183765 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.183743 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.183895 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.183743 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 10:06:13.184631 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.184610 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 10:06:13.184877 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.184625 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.184877 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.184627 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-smlb2\"" Apr 21 10:06:13.259756 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.259722 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/973c1dc9-c2d7-44ac-a3df-ae04fae54595-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.259894 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.259810 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448nq\" (UniqueName: \"kubernetes.io/projected/973c1dc9-c2d7-44ac-a3df-ae04fae54595-kube-api-access-448nq\") pod 
\"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.259894 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.259831 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973c1dc9-c2d7-44ac-a3df-ae04fae54595-config\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.360724 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.360692 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-448nq\" (UniqueName: \"kubernetes.io/projected/973c1dc9-c2d7-44ac-a3df-ae04fae54595-kube-api-access-448nq\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.360806 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.360731 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973c1dc9-c2d7-44ac-a3df-ae04fae54595-config\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.360806 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.360763 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/973c1dc9-c2d7-44ac-a3df-ae04fae54595-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.361935 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:06:13.361904 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973c1dc9-c2d7-44ac-a3df-ae04fae54595-config\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.363116 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.363088 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/973c1dc9-c2d7-44ac-a3df-ae04fae54595-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.368275 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.368256 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-448nq\" (UniqueName: \"kubernetes.io/projected/973c1dc9-c2d7-44ac-a3df-ae04fae54595-kube-api-access-448nq\") pod \"service-ca-operator-d6fc45fc5-5zwgw\" (UID: \"973c1dc9-c2d7-44ac-a3df-ae04fae54595\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.491034 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.490930 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" Apr 21 10:06:13.602518 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:13.602485 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw"] Apr 21 10:06:13.605660 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:13.605622 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973c1dc9_c2d7_44ac_a3df_ae04fae54595.slice/crio-d4ce787441ae830bca54287bca8eb7751e0741c3b80ada897d4285c0ef2f2cce WatchSource:0}: Error finding container d4ce787441ae830bca54287bca8eb7751e0741c3b80ada897d4285c0ef2f2cce: Status 404 returned error can't find the container with id d4ce787441ae830bca54287bca8eb7751e0741c3b80ada897d4285c0ef2f2cce Apr 21 10:06:14.577004 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:14.576968 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" event={"ID":"973c1dc9-c2d7-44ac-a3df-ae04fae54595","Type":"ContainerStarted","Data":"d4ce787441ae830bca54287bca8eb7751e0741c3b80ada897d4285c0ef2f2cce"} Apr 21 10:06:15.580457 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:15.580370 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" event={"ID":"973c1dc9-c2d7-44ac-a3df-ae04fae54595","Type":"ContainerStarted","Data":"5db06704d5c151b0ddc42bf8ffd2d3b22008bfeed9437cf89869f690713f2e70"} Apr 21 10:06:15.596799 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:15.596665 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" podStartSLOduration=0.920267783 podStartE2EDuration="2.596646554s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:13.607512451 +0000 UTC 
m=+143.010436390" lastFinishedPulling="2026-04-21 10:06:15.283891226 +0000 UTC m=+144.686815161" observedRunningTime="2026-04-21 10:06:15.596213736 +0000 UTC m=+144.999137693" watchObservedRunningTime="2026-04-21 10:06:15.596646554 +0000 UTC m=+144.999570512" Apr 21 10:06:21.026458 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:21.026431 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8gn9x_8d1b8400-a581-4d5c-8d5f-39cdbe726442/dns-node-resolver/0.log" Apr 21 10:06:22.234605 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:22.234573 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbz9p_76addc89-f8bd-48f1-b215-22850711d8a8/node-ca/0.log" Apr 21 10:06:27.517663 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:06:27.517622 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bvmct" podUID="8165d059-3605-4311-b7f3-ad6cd4ab874b" Apr 21 10:06:27.532704 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:06:27.532680 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zbcx6" podUID="8c2132b5-17f3-4ef3-89d6-07554e363088" Apr 21 10:06:27.614672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:27.614644 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bvmct" Apr 21 10:06:29.160979 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:06:29.160943 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7rjs4" podUID="e8bb2bdc-f702-42cf-a999-1816acd364ba" Apr 21 10:06:32.498716 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.498675 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:06:32.499100 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.498737 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:06:32.501193 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.501166 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c2132b5-17f3-4ef3-89d6-07554e363088-cert\") pod \"ingress-canary-zbcx6\" (UID: \"8c2132b5-17f3-4ef3-89d6-07554e363088\") " pod="openshift-ingress-canary/ingress-canary-zbcx6" Apr 21 10:06:32.501496 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.501475 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8165d059-3605-4311-b7f3-ad6cd4ab874b-metrics-tls\") pod \"dns-default-bvmct\" (UID: \"8165d059-3605-4311-b7f3-ad6cd4ab874b\") " pod="openshift-dns/dns-default-bvmct" Apr 21 10:06:32.717748 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:06:32.717712 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\"" Apr 21 10:06:32.726227 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.726205 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bvmct" Apr 21 10:06:32.841603 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:32.841575 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bvmct"] Apr 21 10:06:32.844576 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:32.844531 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8165d059_3605_4311_b7f3_ad6cd4ab874b.slice/crio-3cfb2aeca1c40204de5ab2d4d758049aaabfd9dc4b49d773ddc75324267e093b WatchSource:0}: Error finding container 3cfb2aeca1c40204de5ab2d4d758049aaabfd9dc4b49d773ddc75324267e093b: Status 404 returned error can't find the container with id 3cfb2aeca1c40204de5ab2d4d758049aaabfd9dc4b49d773ddc75324267e093b Apr 21 10:06:33.628457 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:33.628424 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bvmct" event={"ID":"8165d059-3605-4311-b7f3-ad6cd4ab874b","Type":"ContainerStarted","Data":"3cfb2aeca1c40204de5ab2d4d758049aaabfd9dc4b49d773ddc75324267e093b"} Apr 21 10:06:35.634690 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:35.634650 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bvmct" event={"ID":"8165d059-3605-4311-b7f3-ad6cd4ab874b","Type":"ContainerStarted","Data":"21388bebfe763c396461069ede143bf77daf6dbc9cd3a1934731fe495bf0bd92"} Apr 21 10:06:35.634690 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:35.634686 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bvmct" 
event={"ID":"8165d059-3605-4311-b7f3-ad6cd4ab874b","Type":"ContainerStarted","Data":"824b9ba1d1d62bda032266959ecb1e85464c3ff0dcc8b7425e7db9496dafae33"} Apr 21 10:06:35.635102 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:35.634785 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bvmct" Apr 21 10:06:35.654008 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:35.653962 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bvmct" podStartSLOduration=129.958084318 podStartE2EDuration="2m11.653950056s" podCreationTimestamp="2026-04-21 10:04:24 +0000 UTC" firstStartedPulling="2026-04-21 10:06:32.846315089 +0000 UTC m=+162.249239024" lastFinishedPulling="2026-04-21 10:06:34.542180826 +0000 UTC m=+163.945104762" observedRunningTime="2026-04-21 10:06:35.653616882 +0000 UTC m=+165.056540839" watchObservedRunningTime="2026-04-21 10:06:35.653950056 +0000 UTC m=+165.056874013" Apr 21 10:06:38.292680 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.292634 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2swzj"] Apr 21 10:06:38.294818 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.294793 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.297620 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.297499 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:06:38.299535 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.299509 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:06:38.299535 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.299520 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wh5gv\"" Apr 21 10:06:38.299763 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.299514 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:06:38.300350 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.300333 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:06:38.310334 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.310303 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2swzj"] Apr 21 10:06:38.338613 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.338580 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-data-volume\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.338767 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.338630 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.338767 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.338681 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.338767 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.338731 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-crio-socket\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.338767 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.338761 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pz9\" (UniqueName: \"kubernetes.io/projected/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-api-access-s2pz9\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.394024 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.393990 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d5f489fc7-4vt77"] Apr 21 10:06:38.395934 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.395916 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" Apr 21 10:06:38.398357 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.398331 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:06:38.398501 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.398365 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:06:38.398501 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.398381 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:06:38.398628 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.398502 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nljzt\"" Apr 21 10:06:38.409011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.408984 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5f489fc7-4vt77"] Apr 21 10:06:38.411315 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.411299 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:06:38.440027 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440001 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-installation-pull-secrets\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" Apr 21 10:06:38.440133 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440033 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-crio-socket\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.440133 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440056 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq58t\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-kube-api-access-zq58t\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" Apr 21 10:06:38.440133 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440083 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.440294 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440146 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-crio-socket\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj" Apr 21 10:06:38.440294 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440155 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-tls\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " 
pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440294 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440189 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-bound-sa-token\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440294 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440221 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-ca-trust-extracted\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440294 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440260 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.440522 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440335 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-trusted-ca\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440522 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440365 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-image-registry-private-configuration\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440522 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440394 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-certificates\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.440522 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440471 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pz9\" (UniqueName: \"kubernetes.io/projected/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-api-access-s2pz9\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.440737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440567 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-data-volume\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.440737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440623 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.440820 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.440798 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-data-volume\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.442691 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.442672 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.460228 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.460203 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pz9\" (UniqueName: \"kubernetes.io/projected/baa2cf7c-1295-4216-ad7b-9dfb4f679b84-kube-api-access-s2pz9\") pod \"insights-runtime-extractor-2swzj\" (UID: \"baa2cf7c-1295-4216-ad7b-9dfb4f679b84\") " pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.541403 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541369 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-installation-pull-secrets\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541530 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541420 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq58t\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-kube-api-access-zq58t\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541530 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541475 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-tls\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541530 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541497 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-bound-sa-token\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541530 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541526 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-ca-trust-extracted\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541806 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541601 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-trusted-ca\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541806 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541775 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-image-registry-private-configuration\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.541910 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.541817 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-certificates\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.542041 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.542016 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-ca-trust-extracted\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.542731 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.542618 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-trusted-ca\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.542829 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.542795 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-certificates\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.543999 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.543973 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-installation-pull-secrets\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.544192 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.544173 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-registry-tls\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.544335 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.544317 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-image-registry-private-configuration\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.550527 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.550504 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq58t\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-kube-api-access-zq58t\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.550645 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.550573 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6c2913a-2e69-4f0a-afd1-391ff1fb6798-bound-sa-token\") pod \"image-registry-d5f489fc7-4vt77\" (UID: \"c6c2913a-2e69-4f0a-afd1-391ff1fb6798\") " pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.604414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.604395 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2swzj"
Apr 21 10:06:38.704864 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.704836 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:38.716381 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.716351 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2swzj"]
Apr 21 10:06:38.719153 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:38.719127 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa2cf7c_1295_4216_ad7b_9dfb4f679b84.slice/crio-08bec2ccf7655da021d6c4d18ff0d0c80c0e669a975ac0124e305eafc9a8032a WatchSource:0}: Error finding container 08bec2ccf7655da021d6c4d18ff0d0c80c0e669a975ac0124e305eafc9a8032a: Status 404 returned error can't find the container with id 08bec2ccf7655da021d6c4d18ff0d0c80c0e669a975ac0124e305eafc9a8032a
Apr 21 10:06:38.831971 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:38.831901 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5f489fc7-4vt77"]
Apr 21 10:06:38.835000 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:38.834973 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c2913a_2e69_4f0a_afd1_391ff1fb6798.slice/crio-4ff39a82140b23bc58cc49daa777084ee6e2a9ce6e1a90a309c70668eec6b300 WatchSource:0}: Error finding container 4ff39a82140b23bc58cc49daa777084ee6e2a9ce6e1a90a309c70668eec6b300: Status 404 returned error can't find the container with id 4ff39a82140b23bc58cc49daa777084ee6e2a9ce6e1a90a309c70668eec6b300
Apr 21 10:06:39.147337 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.147300 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:06:39.149876 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.149850 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\""
Apr 21 10:06:39.157806 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.157781 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbcx6"
Apr 21 10:06:39.294063 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.294009 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbcx6"]
Apr 21 10:06:39.331736 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:39.331705 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2132b5_17f3_4ef3_89d6_07554e363088.slice/crio-02050416267a04f460deed897436d1be52e9b588543693652cc4af17ede3ae7a WatchSource:0}: Error finding container 02050416267a04f460deed897436d1be52e9b588543693652cc4af17ede3ae7a: Status 404 returned error can't find the container with id 02050416267a04f460deed897436d1be52e9b588543693652cc4af17ede3ae7a
Apr 21 10:06:39.647799 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.647762 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbcx6" event={"ID":"8c2132b5-17f3-4ef3-89d6-07554e363088","Type":"ContainerStarted","Data":"02050416267a04f460deed897436d1be52e9b588543693652cc4af17ede3ae7a"}
Apr 21 10:06:39.649204 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.649178 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2swzj" event={"ID":"baa2cf7c-1295-4216-ad7b-9dfb4f679b84","Type":"ContainerStarted","Data":"6d11f5cd2392166a27041389758431ac064105e0acc0115963b6e73ba20d382a"}
Apr 21 10:06:39.649204 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.649207 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2swzj" event={"ID":"baa2cf7c-1295-4216-ad7b-9dfb4f679b84","Type":"ContainerStarted","Data":"2107ee8dd4948cf0118b7bee3b0e0797d5533339a3dca95d8635a89afaabc3b0"}
Apr 21 10:06:39.649354 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.649216 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2swzj" event={"ID":"baa2cf7c-1295-4216-ad7b-9dfb4f679b84","Type":"ContainerStarted","Data":"08bec2ccf7655da021d6c4d18ff0d0c80c0e669a975ac0124e305eafc9a8032a"}
Apr 21 10:06:39.650361 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.650331 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" event={"ID":"c6c2913a-2e69-4f0a-afd1-391ff1fb6798","Type":"ContainerStarted","Data":"2ff077f9f481c6bd63740240c0bccc180cadff02990a371432e0fda3d8304ccf"}
Apr 21 10:06:39.650361 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.650358 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" event={"ID":"c6c2913a-2e69-4f0a-afd1-391ff1fb6798","Type":"ContainerStarted","Data":"4ff39a82140b23bc58cc49daa777084ee6e2a9ce6e1a90a309c70668eec6b300"}
Apr 21 10:06:39.650479 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.650454 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:06:39.672035 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:39.669820 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77" podStartSLOduration=1.669804574 podStartE2EDuration="1.669804574s" podCreationTimestamp="2026-04-21 10:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:39.668597732 +0000 UTC m=+169.071521690" watchObservedRunningTime="2026-04-21 10:06:39.669804574 +0000 UTC m=+169.072728540"
Apr 21 10:06:41.659044 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:41.658953 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbcx6" event={"ID":"8c2132b5-17f3-4ef3-89d6-07554e363088","Type":"ContainerStarted","Data":"e66cda9501573f9503f838d0c64a4d0d7270399d4dafd4f78d0692f6207e99cc"}
Apr 21 10:06:41.660699 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:41.660676 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2swzj" event={"ID":"baa2cf7c-1295-4216-ad7b-9dfb4f679b84","Type":"ContainerStarted","Data":"6349de8c5bf7767508fc7c645a646012b7f3088a5e89ed2cd8d4f990e2a62a48"}
Apr 21 10:06:41.675085 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:41.675041 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zbcx6" podStartSLOduration=135.616344636 podStartE2EDuration="2m17.67502971s" podCreationTimestamp="2026-04-21 10:04:24 +0000 UTC" firstStartedPulling="2026-04-21 10:06:39.333632111 +0000 UTC m=+168.736556049" lastFinishedPulling="2026-04-21 10:06:41.392317188 +0000 UTC m=+170.795241123" observedRunningTime="2026-04-21 10:06:41.674641576 +0000 UTC m=+171.077565537" watchObservedRunningTime="2026-04-21 10:06:41.67502971 +0000 UTC m=+171.077953666"
Apr 21 10:06:41.690700 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:41.690655 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2swzj" podStartSLOduration=1.073056112 podStartE2EDuration="3.690643376s" podCreationTimestamp="2026-04-21 10:06:38 +0000 UTC" firstStartedPulling="2026-04-21 10:06:38.774728822 +0000 UTC m=+168.177652771" lastFinishedPulling="2026-04-21 10:06:41.392316087 +0000 UTC m=+170.795240035" observedRunningTime="2026-04-21 10:06:41.68973008 +0000 UTC m=+171.092654062" watchObservedRunningTime="2026-04-21 10:06:41.690643376 +0000 UTC m=+171.093567401"
Apr 21 10:06:44.146795 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:44.146759 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:06:45.639991 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:45.639961 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bvmct"
Apr 21 10:06:54.274777 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.274742 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:06:54.280631 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.280614 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.283065 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.283044 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 10:06:54.283175 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.283154 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 10:06:54.283246 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.283207 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 10:06:54.284172 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.284156 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 10:06:54.284247 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.284157 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 10:06:54.284298 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.284283 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2kpnx\""
Apr 21 10:06:54.285337 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.285315 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 10:06:54.285437 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.285394 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 10:06:54.288191 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.288171 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:06:54.293373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.293352 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 10:06:54.357525 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357496 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357565 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357588 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357612 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357644 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357666 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.357858 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.357689 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdqz\" (UniqueName: \"kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458670 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458636 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458826 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458679 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdqz\" (UniqueName: \"kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458826 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458709 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458826 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458743 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458885 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.458978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.458959 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.459079 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.459004 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.459459 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.459438 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.459581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.459495 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.459581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.459505 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.459581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.459540 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.461481 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.461462 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.462113 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.462095 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.467054 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.467037 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdqz\" (UniqueName: \"kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz\") pod \"console-76b85d9576-kvnkb\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") " pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.593831 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.593802 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:06:54.708852 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:54.708823 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:06:54.711894 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:06:54.711859 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e972389_e9fd_40b9_96a8_d291f8dcc9d5.slice/crio-63f573185e63cbd0c61b790dc8c0cdb2da59a961da6d8746deb3a72a850c9c14 WatchSource:0}: Error finding container 63f573185e63cbd0c61b790dc8c0cdb2da59a961da6d8746deb3a72a850c9c14: Status 404 returned error can't find the container with id 63f573185e63cbd0c61b790dc8c0cdb2da59a961da6d8746deb3a72a850c9c14
Apr 21 10:06:55.701078 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:55.701033 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b85d9576-kvnkb" event={"ID":"1e972389-e9fd-40b9-96a8-d291f8dcc9d5","Type":"ContainerStarted","Data":"63f573185e63cbd0c61b790dc8c0cdb2da59a961da6d8746deb3a72a850c9c14"}
Apr 21 10:06:57.708057 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:57.708030 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b85d9576-kvnkb" event={"ID":"1e972389-e9fd-40b9-96a8-d291f8dcc9d5","Type":"ContainerStarted","Data":"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"}
Apr 21 10:06:57.734372 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:06:57.733827 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76b85d9576-kvnkb" podStartSLOduration=0.805159713 podStartE2EDuration="3.733809334s" podCreationTimestamp="2026-04-21 10:06:54 +0000 UTC" firstStartedPulling="2026-04-21 10:06:54.713580877 +0000 UTC m=+184.116504815" lastFinishedPulling="2026-04-21 10:06:57.642230501 +0000 UTC m=+187.045154436" observedRunningTime="2026-04-21 10:06:57.732772441 +0000 UTC m=+187.135696398" watchObservedRunningTime="2026-04-21 10:06:57.733809334 +0000 UTC m=+187.136733293"
Apr 21 10:07:00.657448 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:00.657419 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d5f489fc7-4vt77"
Apr 21 10:07:02.927749 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.927714 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2kmxv"]
Apr 21 10:07:02.931177 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.931155 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:02.936126 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936101 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 10:07:02.936358 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936341 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 10:07:02.936642 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936620 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 10:07:02.936736 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936646 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 10:07:02.936736 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936651 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wtmfz\""
Apr 21 10:07:02.936736 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.936689 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 10:07:02.937204 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:02.937183 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 10:07:03.028148 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028116 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-textfile\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028296 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028160 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028296 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028192 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-sys\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028296 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028236 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-root\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028296 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028258 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-wtmp\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028296 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028284 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028310 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-metrics-client-ca\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028336 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5pb\" (UniqueName: \"kubernetes.io/projected/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-kube-api-access-ns5pb\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv"
Apr 21 10:07:03.028447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.028397 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName:
\"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129685 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129650 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-textfile\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129685 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129686 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129708 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-sys\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129731 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-root\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129781 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-root\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129796 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-sys\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:07:03.129831 2543 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129840 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-wtmp\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.129883 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129875 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:07:03.129900 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls podName:2bb04fe7-fa18-4824-9af9-59a002fdcc8b nodeName:}" failed. 
No retries permitted until 2026-04-21 10:07:03.629877272 +0000 UTC m=+193.032801223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls") pod "node-exporter-2kmxv" (UID: "2bb04fe7-fa18-4824-9af9-59a002fdcc8b") : secret "node-exporter-tls" not found Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129945 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-metrics-client-ca\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.129996 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5pb\" (UniqueName: \"kubernetes.io/projected/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-kube-api-access-ns5pb\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.130015 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-wtmp\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.130035 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-textfile\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " 
pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130123 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.130070 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130480 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.130452 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-metrics-client-ca\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.130526 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.130504 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.132237 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.132220 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.139072 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.139046 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns5pb\" (UniqueName: 
\"kubernetes.io/projected/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-kube-api-access-ns5pb\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.634600 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.634562 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.637045 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.637025 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2bb04fe7-fa18-4824-9af9-59a002fdcc8b-node-exporter-tls\") pod \"node-exporter-2kmxv\" (UID: \"2bb04fe7-fa18-4824-9af9-59a002fdcc8b\") " pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.839523 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:03.839491 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2kmxv" Apr 21 10:07:03.848058 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:07:03.848027 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb04fe7_fa18_4824_9af9_59a002fdcc8b.slice/crio-5cd795cf5fb62966c088cf19f81a4df8a91583f9a155f894613ea2904b706a8b WatchSource:0}: Error finding container 5cd795cf5fb62966c088cf19f81a4df8a91583f9a155f894613ea2904b706a8b: Status 404 returned error can't find the container with id 5cd795cf5fb62966c088cf19f81a4df8a91583f9a155f894613ea2904b706a8b Apr 21 10:07:04.594052 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.594027 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76b85d9576-kvnkb" Apr 21 10:07:04.594314 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.594058 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76b85d9576-kvnkb" Apr 21 10:07:04.598813 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.598793 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76b85d9576-kvnkb" Apr 21 10:07:04.730012 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.729981 2543 generic.go:358] "Generic (PLEG): container finished" podID="2bb04fe7-fa18-4824-9af9-59a002fdcc8b" containerID="3b66d9fa830e3a422860bcc5958d32128f817f49a498a612c61f64218c0f750a" exitCode=0 Apr 21 10:07:04.730141 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.730054 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2kmxv" event={"ID":"2bb04fe7-fa18-4824-9af9-59a002fdcc8b","Type":"ContainerDied","Data":"3b66d9fa830e3a422860bcc5958d32128f817f49a498a612c61f64218c0f750a"} Apr 21 10:07:04.730141 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.730088 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-2kmxv" event={"ID":"2bb04fe7-fa18-4824-9af9-59a002fdcc8b","Type":"ContainerStarted","Data":"5cd795cf5fb62966c088cf19f81a4df8a91583f9a155f894613ea2904b706a8b"} Apr 21 10:07:04.734782 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:04.734731 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76b85d9576-kvnkb" Apr 21 10:07:05.733877 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:05.733845 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2kmxv" event={"ID":"2bb04fe7-fa18-4824-9af9-59a002fdcc8b","Type":"ContainerStarted","Data":"75b33bdcd85679aa35c4dc9344658b57e7ef66ef3328483ea1c03ebf85fd21ce"} Apr 21 10:07:05.734345 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:05.733888 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2kmxv" event={"ID":"2bb04fe7-fa18-4824-9af9-59a002fdcc8b","Type":"ContainerStarted","Data":"f4fb1a9a28cebed48154b8ce14ad7d87814af7eed4276712bfa0669bbee98c84"} Apr 21 10:07:05.755622 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:05.755581 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2kmxv" podStartSLOduration=3.081223026 podStartE2EDuration="3.75556691s" podCreationTimestamp="2026-04-21 10:07:02 +0000 UTC" firstStartedPulling="2026-04-21 10:07:03.85008732 +0000 UTC m=+193.253011270" lastFinishedPulling="2026-04-21 10:07:04.524431204 +0000 UTC m=+193.927355154" observedRunningTime="2026-04-21 10:07:05.754650344 +0000 UTC m=+195.157574304" watchObservedRunningTime="2026-04-21 10:07:05.75556691 +0000 UTC m=+195.158490861" Apr 21 10:07:05.795844 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:05.795786 2543 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" podUID="186db918-dd5c-4e62-9acf-d0c7089b3e52" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 10:07:07.887692 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.887662 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"] Apr 21 10:07:07.890676 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.890654 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.899051 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.899029 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"] Apr 21 10:07:07.971410 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971378 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971593 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971437 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971593 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971464 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971593 
ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971492 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971593 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971583 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971616 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcr87\" (UniqueName: \"kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:07.971753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:07.971646 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072368 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca\") pod 
\"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072515 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072424 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072515 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072460 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072651 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072574 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072651 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072636 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072726 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072695 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.072788 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.072727 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcr87\" (UniqueName: \"kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.073162 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.073133 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.073248 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.073183 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.073248 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.073183 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.073700 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.073676 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.075234 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.075203 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.075327 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.075292 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.081383 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.081365 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcr87\" (UniqueName: \"kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87\") pod \"console-6fd8487bcf-cptgr\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") " pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.199967 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.199898 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd8487bcf-cptgr" Apr 21 10:07:08.316962 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.316930 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"] Apr 21 10:07:08.319818 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:07:08.319788 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0311e99f_8952_4cd5_92e9_cf27363de3dc.slice/crio-323a2034988f5a6086de3864b63b4583adb45e6bea8b1c83d85178ee2a9e942e WatchSource:0}: Error finding container 323a2034988f5a6086de3864b63b4583adb45e6bea8b1c83d85178ee2a9e942e: Status 404 returned error can't find the container with id 323a2034988f5a6086de3864b63b4583adb45e6bea8b1c83d85178ee2a9e942e Apr 21 10:07:08.743741 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.743698 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd8487bcf-cptgr" event={"ID":"0311e99f-8952-4cd5-92e9-cf27363de3dc","Type":"ContainerStarted","Data":"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"} Apr 21 10:07:08.743741 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.743745 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd8487bcf-cptgr" event={"ID":"0311e99f-8952-4cd5-92e9-cf27363de3dc","Type":"ContainerStarted","Data":"323a2034988f5a6086de3864b63b4583adb45e6bea8b1c83d85178ee2a9e942e"} Apr 21 10:07:08.760349 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:08.760302 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd8487bcf-cptgr" podStartSLOduration=1.760286969 podStartE2EDuration="1.760286969s" podCreationTimestamp="2026-04-21 10:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:07:08.759435857 +0000 UTC 
m=+198.162359815" watchObservedRunningTime="2026-04-21 10:07:08.760286969 +0000 UTC m=+198.163210926"
Apr 21 10:07:13.731746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:13.731704 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"]
Apr 21 10:07:15.795078 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:15.795043 2543 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" podUID="186db918-dd5c-4e62-9acf-d0c7089b3e52" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 10:07:18.200819 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:18.200782 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fd8487bcf-cptgr"
Apr 21 10:07:24.434665 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:24.434623 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:07:25.794993 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:25.794957 2543 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" podUID="186db918-dd5c-4e62-9acf-d0c7089b3e52" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 10:07:25.795362 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:25.795020 2543 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66"
Apr 21 10:07:25.795514 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:25.795481 2543 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"adde29f12b6ce02295faddae802d754f11225cd38f960737885f09fbc12aa2ab"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 21 10:07:25.795570 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:25.795536 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" podUID="186db918-dd5c-4e62-9acf-d0c7089b3e52" containerName="service-proxy" containerID="cri-o://adde29f12b6ce02295faddae802d754f11225cd38f960737885f09fbc12aa2ab" gracePeriod=30
Apr 21 10:07:26.791403 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:26.791367 2543 generic.go:358] "Generic (PLEG): container finished" podID="186db918-dd5c-4e62-9acf-d0c7089b3e52" containerID="adde29f12b6ce02295faddae802d754f11225cd38f960737885f09fbc12aa2ab" exitCode=2
Apr 21 10:07:26.791610 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:26.791444 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerDied","Data":"adde29f12b6ce02295faddae802d754f11225cd38f960737885f09fbc12aa2ab"}
Apr 21 10:07:26.791610 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:26.791481 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b88d4974b-9mf66" event={"ID":"186db918-dd5c-4e62-9acf-d0c7089b3e52","Type":"ContainerStarted","Data":"656a0f1c61ed99d6ad65de008b5e56f371a883f41b5fb4a18a7a198dc0ae979a"}
Apr 21 10:07:38.751422 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:38.751366 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fd8487bcf-cptgr" podUID="0311e99f-8952-4cd5-92e9-cf27363de3dc" containerName="console" containerID="cri-o://f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512" gracePeriod=15
Apr 21 10:07:38.986456 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:38.986436 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd8487bcf-cptgr_0311e99f-8952-4cd5-92e9-cf27363de3dc/console/0.log"
Apr 21 10:07:38.986587 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:38.986513 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd8487bcf-cptgr"
Apr 21 10:07:39.003501 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003437 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003511 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003562 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003587 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003617 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003895 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003717 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.003895 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003751 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcr87\" (UniqueName: \"kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87\") pod \"0311e99f-8952-4cd5-92e9-cf27363de3dc\" (UID: \"0311e99f-8952-4cd5-92e9-cf27363de3dc\") "
Apr 21 10:07:39.004020 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.003931 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:39.004457 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.004386 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca" (OuterVolumeSpecName: "service-ca") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:39.004644 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.004597 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config" (OuterVolumeSpecName: "console-config") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:39.004644 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.004620 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:39.006662 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.006634 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:39.006782 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.006725 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:39.006782 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.006766 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87" (OuterVolumeSpecName: "kube-api-access-pcr87") pod "0311e99f-8952-4cd5-92e9-cf27363de3dc" (UID: "0311e99f-8952-4cd5-92e9-cf27363de3dc"). InnerVolumeSpecName "kube-api-access-pcr87". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:07:39.104972 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.104932 2543 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-oauth-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.104972 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.104962 2543 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.104972 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.104974 2543 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-oauth-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.104972 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.104983 2543 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-service-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.105227 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.104992 2543 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-console-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.105227 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.105001 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcr87\" (UniqueName: \"kubernetes.io/projected/0311e99f-8952-4cd5-92e9-cf27363de3dc-kube-api-access-pcr87\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.105227 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.105010 2543 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311e99f-8952-4cd5-92e9-cf27363de3dc-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:39.827334 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827307 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd8487bcf-cptgr_0311e99f-8952-4cd5-92e9-cf27363de3dc/console/0.log"
Apr 21 10:07:39.827761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827349 2543 generic.go:358] "Generic (PLEG): container finished" podID="0311e99f-8952-4cd5-92e9-cf27363de3dc" containerID="f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512" exitCode=2
Apr 21 10:07:39.827761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827385 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd8487bcf-cptgr" event={"ID":"0311e99f-8952-4cd5-92e9-cf27363de3dc","Type":"ContainerDied","Data":"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"}
Apr 21 10:07:39.827761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827422 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd8487bcf-cptgr" event={"ID":"0311e99f-8952-4cd5-92e9-cf27363de3dc","Type":"ContainerDied","Data":"323a2034988f5a6086de3864b63b4583adb45e6bea8b1c83d85178ee2a9e942e"}
Apr 21 10:07:39.827761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827421 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd8487bcf-cptgr"
Apr 21 10:07:39.827761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.827433 2543 scope.go:117] "RemoveContainer" containerID="f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"
Apr 21 10:07:39.835288 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.835270 2543 scope.go:117] "RemoveContainer" containerID="f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"
Apr 21 10:07:39.835533 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:07:39.835512 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512\": container with ID starting with f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512 not found: ID does not exist" containerID="f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"
Apr 21 10:07:39.835624 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.835561 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512"} err="failed to get container status \"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512\": rpc error: code = NotFound desc = could not find container \"f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512\": container with ID starting with f84259e0ed1324aafa0258497c32b166635b2f3707efea81a7fcfacde1542512 not found: ID does not exist"
Apr 21 10:07:39.844532 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.844512 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"]
Apr 21 10:07:39.849340 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:39.849307 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fd8487bcf-cptgr"]
Apr 21 10:07:41.151070 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:41.151041 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0311e99f-8952-4cd5-92e9-cf27363de3dc" path="/var/lib/kubelet/pods/0311e99f-8952-4cd5-92e9-cf27363de3dc/volumes"
Apr 21 10:07:41.834615 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:41.834579 2543 generic.go:358] "Generic (PLEG): container finished" podID="973c1dc9-c2d7-44ac-a3df-ae04fae54595" containerID="5db06704d5c151b0ddc42bf8ffd2d3b22008bfeed9437cf89869f690713f2e70" exitCode=0
Apr 21 10:07:41.834778 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:41.834631 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" event={"ID":"973c1dc9-c2d7-44ac-a3df-ae04fae54595","Type":"ContainerDied","Data":"5db06704d5c151b0ddc42bf8ffd2d3b22008bfeed9437cf89869f690713f2e70"}
Apr 21 10:07:41.834946 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:41.834933 2543 scope.go:117] "RemoveContainer" containerID="5db06704d5c151b0ddc42bf8ffd2d3b22008bfeed9437cf89869f690713f2e70"
Apr 21 10:07:42.838172 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:42.838126 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zwgw" event={"ID":"973c1dc9-c2d7-44ac-a3df-ae04fae54595","Type":"ContainerStarted","Data":"e05a52adeb1a991f61efe6f409e044212f26ba9288daac37c1578650353b2d23"}
Apr 21 10:07:49.453489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.453419 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76b85d9576-kvnkb" podUID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" containerName="console" containerID="cri-o://c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b" gracePeriod=15
Apr 21 10:07:49.682040 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.682017 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b85d9576-kvnkb_1e972389-e9fd-40b9-96a8-d291f8dcc9d5/console/0.log"
Apr 21 10:07:49.682143 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.682077 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:07:49.785406 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785321 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785406 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785356 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785406 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785375 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785406 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785394 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785455 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdqz\" (UniqueName: \"kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785485 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785507 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert\") pod \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\" (UID: \"1e972389-e9fd-40b9-96a8-d291f8dcc9d5\") "
Apr 21 10:07:49.785962 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785923 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:49.785962 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785923 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config" (OuterVolumeSpecName: "console-config") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:49.786072 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785957 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca" (OuterVolumeSpecName: "service-ca") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:49.786072 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.785997 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:49.787765 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.787734 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:49.787765 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.787741 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz" (OuterVolumeSpecName: "kube-api-access-vgdqz") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "kube-api-access-vgdqz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:07:49.787893 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.787809 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1e972389-e9fd-40b9-96a8-d291f8dcc9d5" (UID: "1e972389-e9fd-40b9-96a8-d291f8dcc9d5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:49.861044 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861019 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b85d9576-kvnkb_1e972389-e9fd-40b9-96a8-d291f8dcc9d5/console/0.log"
Apr 21 10:07:49.861194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861060 2543 generic.go:358] "Generic (PLEG): container finished" podID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" containerID="c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b" exitCode=2
Apr 21 10:07:49.861194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861096 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b85d9576-kvnkb" event={"ID":"1e972389-e9fd-40b9-96a8-d291f8dcc9d5","Type":"ContainerDied","Data":"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"}
Apr 21 10:07:49.861194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861131 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b85d9576-kvnkb" event={"ID":"1e972389-e9fd-40b9-96a8-d291f8dcc9d5","Type":"ContainerDied","Data":"63f573185e63cbd0c61b790dc8c0cdb2da59a961da6d8746deb3a72a850c9c14"}
Apr 21 10:07:49.861194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861134 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b85d9576-kvnkb"
Apr 21 10:07:49.861194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.861145 2543 scope.go:117] "RemoveContainer" containerID="c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"
Apr 21 10:07:49.869602 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.869580 2543 scope.go:117] "RemoveContainer" containerID="c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"
Apr 21 10:07:49.869836 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:07:49.869819 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b\": container with ID starting with c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b not found: ID does not exist" containerID="c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"
Apr 21 10:07:49.869887 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.869844 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b"} err="failed to get container status \"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b\": rpc error: code = NotFound desc = could not find container \"c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b\": container with ID starting with c503734d026b938b8820eb00d558a7e84dc05629e21b73aae1f46f1c358e165b not found: ID does not exist"
Apr 21 10:07:49.882032 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.882006 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:07:49.886173 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886149 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76b85d9576-kvnkb"]
Apr 21 10:07:49.886411 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886393 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vgdqz\" (UniqueName: \"kubernetes.io/projected/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-kube-api-access-vgdqz\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886416 2543 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-service-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886427 2543 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-oauth-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886435 2543 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-oauth-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886444 2543 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886452 2543 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:49.886623 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:49.886461 2543 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e972389-e9fd-40b9-96a8-d291f8dcc9d5-console-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:07:51.150005 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:07:51.149973 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" path="/var/lib/kubelet/pods/1e972389-e9fd-40b9-96a8-d291f8dcc9d5/volumes"
Apr 21 10:08:03.072749 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.072712 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:08:03.075140 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.075115 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8bb2bdc-f702-42cf-a999-1816acd364ba-metrics-certs\") pod \"network-metrics-daemon-7rjs4\" (UID: \"e8bb2bdc-f702-42cf-a999-1816acd364ba\") " pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:08:03.350829 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.350793 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\""
Apr 21 10:08:03.358681 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.358653 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rjs4"
Apr 21 10:08:03.473490 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.473452 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rjs4"]
Apr 21 10:08:03.477028 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:08:03.476998 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8bb2bdc_f702_42cf_a999_1816acd364ba.slice/crio-b916e53f8c06e14383be8b1b1468aaccc67349640c8434522a56443e5377b112 WatchSource:0}: Error finding container b916e53f8c06e14383be8b1b1468aaccc67349640c8434522a56443e5377b112: Status 404 returned error can't find the container with id b916e53f8c06e14383be8b1b1468aaccc67349640c8434522a56443e5377b112
Apr 21 10:08:03.898423 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:03.898384 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rjs4" event={"ID":"e8bb2bdc-f702-42cf-a999-1816acd364ba","Type":"ContainerStarted","Data":"b916e53f8c06e14383be8b1b1468aaccc67349640c8434522a56443e5377b112"}
Apr 21 10:08:04.902447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:04.902405 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rjs4" event={"ID":"e8bb2bdc-f702-42cf-a999-1816acd364ba","Type":"ContainerStarted","Data":"9cca6d5189bf87203a7081350964039fde1bb3ee85cbb7374b487e210d1d5a49"}
Apr 21 10:08:04.902447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:04.902439 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rjs4" event={"ID":"e8bb2bdc-f702-42cf-a999-1816acd364ba","Type":"ContainerStarted","Data":"f692fbc49bf763393d5c5b8fece363c2efe72147820403bba56bee1d4a778ac5"}
Apr 21 10:08:04.918253 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:04.918189 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7rjs4" podStartSLOduration=252.951316567 podStartE2EDuration="4m13.918171291s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:08:03.478879771 +0000 UTC m=+252.881803706" lastFinishedPulling="2026-04-21 10:08:04.445734478 +0000 UTC m=+253.848658430" observedRunningTime="2026-04-21 10:08:04.917061537 +0000 UTC m=+254.319985496" watchObservedRunningTime="2026-04-21 10:08:04.918171291 +0000 UTC m=+254.321095257"
Apr 21 10:08:21.066853 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.066812 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-96d4958b5-drlsc"]
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067022 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0311e99f-8952-4cd5-92e9-cf27363de3dc" containerName="console"
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067033 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="0311e99f-8952-4cd5-92e9-cf27363de3dc" containerName="console"
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067049 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" containerName="console"
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067055 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" containerName="console"
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067096 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e972389-e9fd-40b9-96a8-d291f8dcc9d5" containerName="console"
Apr 21 10:08:21.067341 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.067103 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="0311e99f-8952-4cd5-92e9-cf27363de3dc" containerName="console"
Apr 21 10:08:21.070042 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.070024 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.073609 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.073586 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 10:08:21.073712 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.073676 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 10:08:21.074794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.074779 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 10:08:21.074794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.074794 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 10:08:21.074934 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.074815 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 10:08:21.074934 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.074875 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 10:08:21.074934 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.074881 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2kpnx\""
Apr 21 10:08:21.075145 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.075125 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 10:08:21.079607 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.079588 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 10:08:21.080592 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.080567 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-96d4958b5-drlsc"]
Apr 21 10:08:21.200558 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200522 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200573 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200592 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200680 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200711 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200924 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200767 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd7xt\" (UniqueName: \"kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.200924 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.200794 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.301639 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301602 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:08:21.301639 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301643 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca\") pod
\"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.301865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301668 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.301865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301707 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.301865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301729 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.301865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301763 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd7xt\" (UniqueName: \"kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.301865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.301831 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.302466 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.302432 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.302466 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.302432 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.302650 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.302435 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.302762 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.302745 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.304243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.304215 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.304394 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.304378 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.315257 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.315238 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd7xt\" (UniqueName: \"kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt\") pod \"console-96d4958b5-drlsc\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") " pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.379302 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.379268 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:21.500316 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.500287 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-96d4958b5-drlsc"] Apr 21 10:08:21.503520 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:08:21.503492 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6d8d2c_693e_4edb_9709_9c56105eefbe.slice/crio-b5f8d2e67f0272995242829c0da2d1915e62d726c3d4a7b0b46f816edb8d0848 WatchSource:0}: Error finding container b5f8d2e67f0272995242829c0da2d1915e62d726c3d4a7b0b46f816edb8d0848: Status 404 returned error can't find the container with id b5f8d2e67f0272995242829c0da2d1915e62d726c3d4a7b0b46f816edb8d0848 Apr 21 10:08:21.953794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.953751 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d4958b5-drlsc" event={"ID":"3b6d8d2c-693e-4edb-9709-9c56105eefbe","Type":"ContainerStarted","Data":"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"} Apr 21 10:08:21.953794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.953796 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d4958b5-drlsc" event={"ID":"3b6d8d2c-693e-4edb-9709-9c56105eefbe","Type":"ContainerStarted","Data":"b5f8d2e67f0272995242829c0da2d1915e62d726c3d4a7b0b46f816edb8d0848"} Apr 21 10:08:21.971507 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:21.971467 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-96d4958b5-drlsc" podStartSLOduration=0.97145183 podStartE2EDuration="971.45183ms" podCreationTimestamp="2026-04-21 10:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:08:21.971071544 +0000 UTC m=+271.373995525" 
watchObservedRunningTime="2026-04-21 10:08:21.97145183 +0000 UTC m=+271.374375787" Apr 21 10:08:31.380117 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:31.380069 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:31.380117 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:31.380124 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:31.384748 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:31.384721 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:31.984353 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:31.984321 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-96d4958b5-drlsc" Apr 21 10:08:51.027610 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:08:51.027579 2543 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 10:09:21.685217 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.685180 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw"] Apr 21 10:09:21.688143 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.688127 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.690800 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.690777 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 10:09:21.690907 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.690832 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 10:09:21.691688 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.691663 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\"" Apr 21 10:09:21.699220 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.699197 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw"] Apr 21 10:09:21.736691 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.736662 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.736819 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.736693 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlzh\" (UniqueName: \"kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.736819 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.736731 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.837012 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.836981 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.837012 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.837015 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlzh\" (UniqueName: \"kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.837244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.837034 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.837336 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.837315 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.837391 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.837333 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.846451 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.846419 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlzh\" (UniqueName: \"kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:21.999412 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:21.999336 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:22.132704 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:22.132669 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw"] Apr 21 10:09:22.135998 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:09:22.135969 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd165cb7_d4d3_4e8c_8cee_5171ca6cb6d4.slice/crio-079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358 WatchSource:0}: Error finding container 079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358: Status 404 returned error can't find the container with id 079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358 Apr 21 10:09:22.137802 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:22.137786 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:09:23.108968 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:23.108931 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" event={"ID":"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4","Type":"ContainerStarted","Data":"079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358"} Apr 21 10:09:28.129416 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:28.129323 2543 generic.go:358] "Generic (PLEG): container finished" podID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerID="add54b7e3fa28ae5c5a9a0ce25f147834bd3f18b519ff22498535066dd69772e" exitCode=0 Apr 21 10:09:28.129416 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:28.129392 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" 
event={"ID":"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4","Type":"ContainerDied","Data":"add54b7e3fa28ae5c5a9a0ce25f147834bd3f18b519ff22498535066dd69772e"} Apr 21 10:09:31.138894 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:31.138860 2543 generic.go:358] "Generic (PLEG): container finished" podID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerID="544c1e20809be63a77c7d9123242376fd813ab739d2a282b984ba117857fe48f" exitCode=0 Apr 21 10:09:31.138894 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:31.138897 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" event={"ID":"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4","Type":"ContainerDied","Data":"544c1e20809be63a77c7d9123242376fd813ab739d2a282b984ba117857fe48f"} Apr 21 10:09:39.162339 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:39.162300 2543 generic.go:358] "Generic (PLEG): container finished" podID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerID="3bae9ac0dfe8113db66cb8f91339fa8cf5045040e1da5aac7405e74d27a73ab6" exitCode=0 Apr 21 10:09:39.162732 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:39.162374 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" event={"ID":"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4","Type":"ContainerDied","Data":"3bae9ac0dfe8113db66cb8f91339fa8cf5045040e1da5aac7405e74d27a73ab6"} Apr 21 10:09:40.278951 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.278921 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:40.372333 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.372307 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle\") pod \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " Apr 21 10:09:40.372486 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.372342 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util\") pod \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " Apr 21 10:09:40.372486 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.372370 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dlzh\" (UniqueName: \"kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh\") pod \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\" (UID: \"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4\") " Apr 21 10:09:40.372841 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.372819 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle" (OuterVolumeSpecName: "bundle") pod "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" (UID: "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:09:40.374752 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.374724 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh" (OuterVolumeSpecName: "kube-api-access-9dlzh") pod "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" (UID: "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4"). InnerVolumeSpecName "kube-api-access-9dlzh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:09:40.376111 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.376091 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util" (OuterVolumeSpecName: "util") pod "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" (UID: "dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:09:40.472999 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.472920 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:09:40.472999 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.472957 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9dlzh\" (UniqueName: \"kubernetes.io/projected/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-kube-api-access-9dlzh\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:09:40.472999 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:40.472968 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:09:41.168777 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:41.168748 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" event={"ID":"dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4","Type":"ContainerDied","Data":"079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358"} Apr 21 10:09:41.168777 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:41.168781 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079e0fdb45d3488f30cd10395c95566a6308a0047189261ec226bf9fd2131358" Apr 21 10:09:41.168943 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:41.168760 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cbnpkw" Apr 21 10:09:48.598737 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598701 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nnlg7"] Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598951 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="util" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598961 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="util" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598971 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="extract" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598977 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="extract" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.598995 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="pull" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.599000 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="pull" Apr 21 10:09:48.599229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.599035 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd165cb7-d4d3-4e8c-8cee-5171ca6cb6d4" containerName="extract" Apr 21 10:09:48.652519 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.652490 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nnlg7"] Apr 21 10:09:48.652683 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.652534 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nnlg7" Apr 21 10:09:48.655501 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.655483 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 21 10:09:48.656138 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.656123 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-ggxhf\"" Apr 21 10:09:48.656594 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.656578 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 21 10:09:48.661524 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.661504 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 21 10:09:48.672323 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.672307 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 21 10:09:48.731684 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:09:48.731662 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:48.731794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.731722 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkb8b\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-kube-api-access-gkb8b\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:48.832593 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.832567 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:48.832724 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.832637 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkb8b\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-kube-api-access-gkb8b\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:48.832724 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:09:48.832651 2543 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 21 10:09:48.832724 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:09:48.832676 2543 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-nnlg7: secret "keda-admission-webhooks-certs" not found
Apr 21 10:09:48.832825 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:09:48.832741 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates podName:b5e231ed-114e-4cc1-8400-471235f6a870 nodeName:}" failed. No retries permitted until 2026-04-21 10:09:49.332724807 +0000 UTC m=+358.735648743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates") pod "keda-admission-cf49989db-nnlg7" (UID: "b5e231ed-114e-4cc1-8400-471235f6a870") : secret "keda-admission-webhooks-certs" not found
Apr 21 10:09:48.848017 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:48.847992 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkb8b\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-kube-api-access-gkb8b\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:49.336537 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:49.336495 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:49.339021 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:49.338997 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5e231ed-114e-4cc1-8400-471235f6a870-certificates\") pod \"keda-admission-cf49989db-nnlg7\" (UID: \"b5e231ed-114e-4cc1-8400-471235f6a870\") " pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:49.562745 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:49.562706 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:49.692788 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:49.692706 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nnlg7"]
Apr 21 10:09:49.695379 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:09:49.695352 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e231ed_114e_4cc1_8400_471235f6a870.slice/crio-93d6ea1b74982b79b2b1c40e99b1c16425118162b347e32ef079539244c127dc WatchSource:0}: Error finding container 93d6ea1b74982b79b2b1c40e99b1c16425118162b347e32ef079539244c127dc: Status 404 returned error can't find the container with id 93d6ea1b74982b79b2b1c40e99b1c16425118162b347e32ef079539244c127dc
Apr 21 10:09:50.194464 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:50.194433 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nnlg7" event={"ID":"b5e231ed-114e-4cc1-8400-471235f6a870","Type":"ContainerStarted","Data":"93d6ea1b74982b79b2b1c40e99b1c16425118162b347e32ef079539244c127dc"}
Apr 21 10:09:51.198773 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:51.198686 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nnlg7" event={"ID":"b5e231ed-114e-4cc1-8400-471235f6a870","Type":"ContainerStarted","Data":"6ed63ecf437b706ef28159b309019563101136af7b0d5096a5ca538afb70164a"}
Apr 21 10:09:51.199110 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:51.198798 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:09:51.218169 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:09:51.218119 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nnlg7" podStartSLOduration=1.965303473 podStartE2EDuration="3.218106966s" podCreationTimestamp="2026-04-21 10:09:48 +0000 UTC" firstStartedPulling="2026-04-21 10:09:49.697191801 +0000 UTC m=+359.100115750" lastFinishedPulling="2026-04-21 10:09:50.949995303 +0000 UTC m=+360.352919243" observedRunningTime="2026-04-21 10:09:51.217004227 +0000 UTC m=+360.619928184" watchObservedRunningTime="2026-04-21 10:09:51.218106966 +0000 UTC m=+360.621030922"
Apr 21 10:10:12.204110 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:12.204078 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-nnlg7"
Apr 21 10:10:41.745927 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.745892 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"]
Apr 21 10:10:41.747949 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.747933 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:41.750601 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.750578 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 10:10:41.751654 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.751629 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\""
Apr 21 10:10:41.751654 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.751652 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 10:10:41.756974 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.756945 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"]
Apr 21 10:10:41.906242 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.906195 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:41.906427 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.906277 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:41.906427 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:41.906314 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtk8\" (UniqueName: \"kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.007022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.006939 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.007022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.006981 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtk8\" (UniqueName: \"kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.007022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.007013 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.007306 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.007285 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.007372 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.007310 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.015973 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.015944 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtk8\" (UniqueName: \"kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.056940 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.056915 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:42.190763 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.190732 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"]
Apr 21 10:10:42.193906 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:10:42.193881 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345c1fc4_feb8_485a_aaad_b235c8cb27fe.slice/crio-c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1 WatchSource:0}: Error finding container c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1: Status 404 returned error can't find the container with id c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1
Apr 21 10:10:42.326978 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.326945 2543 generic.go:358] "Generic (PLEG): container finished" podID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerID="45cd628368968e9997f92b6274e96f753c7dc81a8a6ed7c0d51530c712e2e485" exitCode=0
Apr 21 10:10:42.327115 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.326988 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerDied","Data":"45cd628368968e9997f92b6274e96f753c7dc81a8a6ed7c0d51530c712e2e485"}
Apr 21 10:10:42.327115 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:42.327013 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerStarted","Data":"c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1"}
Apr 21 10:10:44.334046 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:44.334001 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerStarted","Data":"08058b36838046e3e6267bb2d2a84aa6b210743d490f5404b92700e6e19a8934"}
Apr 21 10:10:45.338404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:45.338364 2543 generic.go:358] "Generic (PLEG): container finished" podID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerID="08058b36838046e3e6267bb2d2a84aa6b210743d490f5404b92700e6e19a8934" exitCode=0
Apr 21 10:10:45.338808 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:45.338423 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerDied","Data":"08058b36838046e3e6267bb2d2a84aa6b210743d490f5404b92700e6e19a8934"}
Apr 21 10:10:46.343189 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:46.343146 2543 generic.go:358] "Generic (PLEG): container finished" podID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerID="f94c7aaef594229de7508ee93ddea433bfb5abb2323d8ae039e9d4cca17d0d59" exitCode=0
Apr 21 10:10:46.343591 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:46.343207 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerDied","Data":"f94c7aaef594229de7508ee93ddea433bfb5abb2323d8ae039e9d4cca17d0d59"}
Apr 21 10:10:47.465526 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.465505 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:10:47.654753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.654681 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mtk8\" (UniqueName: \"kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8\") pod \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") "
Apr 21 10:10:47.654753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.654723 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util\") pod \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") "
Apr 21 10:10:47.654921 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.654758 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle\") pod \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\" (UID: \"345c1fc4-feb8-485a-aaad-b235c8cb27fe\") "
Apr 21 10:10:47.655452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.655420 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle" (OuterVolumeSpecName: "bundle") pod "345c1fc4-feb8-485a-aaad-b235c8cb27fe" (UID: "345c1fc4-feb8-485a-aaad-b235c8cb27fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:10:47.656757 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.656737 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8" (OuterVolumeSpecName: "kube-api-access-4mtk8") pod "345c1fc4-feb8-485a-aaad-b235c8cb27fe" (UID: "345c1fc4-feb8-485a-aaad-b235c8cb27fe"). InnerVolumeSpecName "kube-api-access-4mtk8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:10:47.659291 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.659245 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util" (OuterVolumeSpecName: "util") pod "345c1fc4-feb8-485a-aaad-b235c8cb27fe" (UID: "345c1fc4-feb8-485a-aaad-b235c8cb27fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:10:47.756226 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.756193 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:10:47.756226 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.756224 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mtk8\" (UniqueName: \"kubernetes.io/projected/345c1fc4-feb8-485a-aaad-b235c8cb27fe-kube-api-access-4mtk8\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:10:47.756405 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:47.756241 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/345c1fc4-feb8-485a-aaad-b235c8cb27fe-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:10:48.350477 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:48.350435 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6" event={"ID":"345c1fc4-feb8-485a-aaad-b235c8cb27fe","Type":"ContainerDied","Data":"c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1"}
Apr 21 10:10:48.350477 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:48.350475 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7a16e33a693a56c349966a100ae479c6cfeb7942f85cc51a4423136368be4f1"
Apr 21 10:10:48.350699 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:10:48.350478 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djtrb6"
Apr 21 10:11:03.455003 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.454969 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fn2tz"]
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455226 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="pull"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455237 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="pull"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455247 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="extract"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455252 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="extract"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455260 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="util"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455265 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="util"
Apr 21 10:11:03.455414 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.455304 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c1fc4-feb8-485a-aaad-b235c8cb27fe" containerName="extract"
Apr 21 10:11:03.461734 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.461718 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.464885 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.464866 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 10:11:03.465519 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.465504 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 10:11:03.465937 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.465923 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-wklc5\""
Apr 21 10:11:03.474107 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.474085 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fn2tz"]
Apr 21 10:11:03.561728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.561688 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzz4x\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-kube-api-access-tzz4x\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.561892 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.561798 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.662605 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.662574 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.662762 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.662612 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzz4x\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-kube-api-access-tzz4x\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.671415 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.671388 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.671519 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.671431 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzz4x\" (UniqueName: \"kubernetes.io/projected/924cef49-eb6b-4025-88ba-a5b3b20f9905-kube-api-access-tzz4x\") pod \"cert-manager-cainjector-68b757865b-fn2tz\" (UID: \"924cef49-eb6b-4025-88ba-a5b3b20f9905\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.771149 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.771077 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz"
Apr 21 10:11:03.856339 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.856311 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"]
Apr 21 10:11:03.861213 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.861192 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.863935 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.863819 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\""
Apr 21 10:11:03.866489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.864623 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 10:11:03.866489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.864644 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 10:11:03.866489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.865801 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q4t5\" (UniqueName: \"kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.866489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.865862 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.866489 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.865917 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.868905 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.868881 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"]
Apr 21 10:11:03.891041 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.891014 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fn2tz"]
Apr 21 10:11:03.893758 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:11:03.893727 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924cef49_eb6b_4025_88ba_a5b3b20f9905.slice/crio-174a0a96bb9e3e9b3540582f863084652e5f52005ff51a354f9ca51ea90fd65d WatchSource:0}: Error finding container 174a0a96bb9e3e9b3540582f863084652e5f52005ff51a354f9ca51ea90fd65d: Status 404 returned error can't find the container with id 174a0a96bb9e3e9b3540582f863084652e5f52005ff51a354f9ca51ea90fd65d
Apr 21 10:11:03.966373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.966328 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q4t5\" (UniqueName: \"kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.966373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.966383 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.966629 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.966415 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.966751 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.966734 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.966822 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.966800 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:03.975525 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:03.975504 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q4t5\" (UniqueName: \"kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:04.171744 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:04.171702 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:04.291600 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:04.291569 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"]
Apr 21 10:11:04.294186 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:11:04.294161 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddccfacc7_b554_4ddb_b543_d300191e8b6a.slice/crio-acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67 WatchSource:0}: Error finding container acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67: Status 404 returned error can't find the container with id acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67
Apr 21 10:11:04.393764 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:04.393703 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerStarted","Data":"07b0fa7a7b6abb23dca104e5fe616e13014e453fd59e960c500867da99e851da"}
Apr 21 10:11:04.393922 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:04.393775 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerStarted","Data":"acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67"}
Apr 21 10:11:04.395115 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:04.395090 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz" event={"ID":"924cef49-eb6b-4025-88ba-a5b3b20f9905","Type":"ContainerStarted","Data":"174a0a96bb9e3e9b3540582f863084652e5f52005ff51a354f9ca51ea90fd65d"}
Apr 21 10:11:05.400111 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:05.400075 2543 generic.go:358] "Generic (PLEG): container finished" podID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerID="07b0fa7a7b6abb23dca104e5fe616e13014e453fd59e960c500867da99e851da" exitCode=0
Apr 21 10:11:05.400560 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:05.400148 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerDied","Data":"07b0fa7a7b6abb23dca104e5fe616e13014e453fd59e960c500867da99e851da"}
Apr 21 10:11:06.405880 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:06.405836 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz" event={"ID":"924cef49-eb6b-4025-88ba-a5b3b20f9905","Type":"ContainerStarted","Data":"7c1dbfc566c6be340dee094d8c56f024b27fb1c1ebd8cebde36f73860a94b604"}
Apr 21 10:11:06.428665 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:06.428571 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-fn2tz" podStartSLOduration=1.151694596 podStartE2EDuration="3.428535152s" podCreationTimestamp="2026-04-21 10:11:03 +0000 UTC" firstStartedPulling="2026-04-21 10:11:03.895481398 +0000 UTC m=+433.298405332" lastFinishedPulling="2026-04-21 10:11:06.172321933 +0000 UTC m=+435.575245888" observedRunningTime="2026-04-21 10:11:06.427895359 +0000 UTC m=+435.830819334" watchObservedRunningTime="2026-04-21 10:11:06.428535152 +0000 UTC m=+435.831459110"
Apr 21 10:11:08.413179 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:08.413146 2543 generic.go:358] "Generic (PLEG): container finished" podID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerID="d37d6ecf882647de4f713ea98b0ea93abe0e4fac4a42d8319e39b84a6559e2bf" exitCode=0
Apr 21 10:11:08.413527 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:08.413213 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerDied","Data":"d37d6ecf882647de4f713ea98b0ea93abe0e4fac4a42d8319e39b84a6559e2bf"}
Apr 21 10:11:09.417745 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:09.417713 2543 generic.go:358] "Generic (PLEG): container finished" podID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerID="0ac793ea7961b09e29a82f5904545cc864597ac4b8221fff2bc6209edb0dcefe" exitCode=0
Apr 21 10:11:09.418099 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:09.417804 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerDied","Data":"0ac793ea7961b09e29a82f5904545cc864597ac4b8221fff2bc6209edb0dcefe"}
Apr 21 10:11:10.533874 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.533851 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz"
Apr 21 10:11:10.621360 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.621322 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q4t5\" (UniqueName: \"kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5\") pod \"dccfacc7-b554-4ddb-b543-d300191e8b6a\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") "
Apr 21 10:11:10.621586 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.621425 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util\") pod \"dccfacc7-b554-4ddb-b543-d300191e8b6a\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") "
Apr 21 10:11:10.621586 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.621465 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle\") pod \"dccfacc7-b554-4ddb-b543-d300191e8b6a\" (UID: \"dccfacc7-b554-4ddb-b543-d300191e8b6a\") "
Apr 21 10:11:10.621864 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.621831 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle" (OuterVolumeSpecName: "bundle") pod "dccfacc7-b554-4ddb-b543-d300191e8b6a" (UID: "dccfacc7-b554-4ddb-b543-d300191e8b6a"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:10.623562 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.623524 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5" (OuterVolumeSpecName: "kube-api-access-8q4t5") pod "dccfacc7-b554-4ddb-b543-d300191e8b6a" (UID: "dccfacc7-b554-4ddb-b543-d300191e8b6a"). InnerVolumeSpecName "kube-api-access-8q4t5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:11:10.628278 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.628254 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util" (OuterVolumeSpecName: "util") pod "dccfacc7-b554-4ddb-b543-d300191e8b6a" (UID: "dccfacc7-b554-4ddb-b543-d300191e8b6a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:10.722888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.722799 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8q4t5\" (UniqueName: \"kubernetes.io/projected/dccfacc7-b554-4ddb-b543-d300191e8b6a-kube-api-access-8q4t5\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:10.722888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.722837 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:10.722888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:10.722847 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dccfacc7-b554-4ddb-b543-d300191e8b6a-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:11.425582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:11.425475 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" event={"ID":"dccfacc7-b554-4ddb-b543-d300191e8b6a","Type":"ContainerDied","Data":"acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67"} Apr 21 10:11:11.425582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:11.425497 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhqxhz" Apr 21 10:11:11.425582 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:11.425509 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc03d30ef8c428b9389db06bc69c7f8c5ed5a799ad0547c56f1044cde98cc67" Apr 21 10:11:34.261492 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261058 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55"] Apr 21 10:11:34.261492 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261398 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="extract" Apr 21 10:11:34.261492 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261414 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="extract" Apr 21 10:11:34.261492 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261443 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="pull" Apr 21 10:11:34.261492 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261452 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="pull" Apr 21 10:11:34.262349 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261464 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="util" Apr 21 10:11:34.262349 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261574 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="util" Apr 21 10:11:34.262349 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.261644 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="dccfacc7-b554-4ddb-b543-d300191e8b6a" containerName="extract" Apr 21 10:11:34.268665 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.268634 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55"] Apr 21 10:11:34.268800 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.268751 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.271276 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.271249 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 10:11:34.271419 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.271337 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\"" Apr 21 10:11:34.272319 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.272297 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 10:11:34.391197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.391160 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnx2\" (UniqueName: \"kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" 
(UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.391197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.391203 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.391405 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.391226 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.491560 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.491515 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnx2\" (UniqueName: \"kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.491733 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.491579 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: 
\"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.491733 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.491606 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.491972 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.491951 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.492042 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.491988 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.500031 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.499998 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnx2\" (UniqueName: \"kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.577789 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.577722 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:34.697176 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:34.697149 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55"] Apr 21 10:11:34.699582 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:11:34.699538 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b2880_60c1_441f_869d_29961d46b900.slice/crio-f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2 WatchSource:0}: Error finding container f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2: Status 404 returned error can't find the container with id f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2 Apr 21 10:11:35.496726 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:35.496687 2543 generic.go:358] "Generic (PLEG): container finished" podID="199b2880-60c1-441f-869d-29961d46b900" containerID="5bdbe864f824cc7296d3fde263990c4a5570df25ccbf663d21ba4a147ca80816" exitCode=0 Apr 21 10:11:35.497153 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:35.496735 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" event={"ID":"199b2880-60c1-441f-869d-29961d46b900","Type":"ContainerDied","Data":"5bdbe864f824cc7296d3fde263990c4a5570df25ccbf663d21ba4a147ca80816"} Apr 21 10:11:35.497153 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:35.496758 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" event={"ID":"199b2880-60c1-441f-869d-29961d46b900","Type":"ContainerStarted","Data":"f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2"} Apr 21 10:11:37.505648 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:37.505608 2543 generic.go:358] "Generic (PLEG): container finished" podID="199b2880-60c1-441f-869d-29961d46b900" containerID="216e0a6c5beeb52e8f58ca95fb3a90715dbffe702a3464f2e16ed54c2d75df31" exitCode=0 Apr 21 10:11:37.506117 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:37.505670 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" event={"ID":"199b2880-60c1-441f-869d-29961d46b900","Type":"ContainerDied","Data":"216e0a6c5beeb52e8f58ca95fb3a90715dbffe702a3464f2e16ed54c2d75df31"} Apr 21 10:11:38.510765 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:38.510733 2543 generic.go:358] "Generic (PLEG): container finished" podID="199b2880-60c1-441f-869d-29961d46b900" containerID="4fa1d144c367ab785a99b999fbe41a51c8a0401f0800143ac7968bcc54b79747" exitCode=0 Apr 21 10:11:38.511137 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:38.510826 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" event={"ID":"199b2880-60c1-441f-869d-29961d46b900","Type":"ContainerDied","Data":"4fa1d144c367ab785a99b999fbe41a51c8a0401f0800143ac7968bcc54b79747"} Apr 21 10:11:39.630599 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.630577 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:39.735481 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.735454 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util\") pod \"199b2880-60c1-441f-869d-29961d46b900\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " Apr 21 10:11:39.735644 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.735537 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnnx2\" (UniqueName: \"kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2\") pod \"199b2880-60c1-441f-869d-29961d46b900\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " Apr 21 10:11:39.735644 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.735597 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle\") pod \"199b2880-60c1-441f-869d-29961d46b900\" (UID: \"199b2880-60c1-441f-869d-29961d46b900\") " Apr 21 10:11:39.736421 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.736394 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle" (OuterVolumeSpecName: "bundle") pod "199b2880-60c1-441f-869d-29961d46b900" (UID: "199b2880-60c1-441f-869d-29961d46b900"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:39.737764 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.737729 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2" (OuterVolumeSpecName: "kube-api-access-tnnx2") pod "199b2880-60c1-441f-869d-29961d46b900" (UID: "199b2880-60c1-441f-869d-29961d46b900"). InnerVolumeSpecName "kube-api-access-tnnx2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:11:39.741235 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.741202 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util" (OuterVolumeSpecName: "util") pod "199b2880-60c1-441f-869d-29961d46b900" (UID: "199b2880-60c1-441f-869d-29961d46b900"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:39.836243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.836167 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:39.836243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.836197 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnnx2\" (UniqueName: \"kubernetes.io/projected/199b2880-60c1-441f-869d-29961d46b900-kube-api-access-tnnx2\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:39.836243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:39.836209 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199b2880-60c1-441f-869d-29961d46b900-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:40.518243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:40.518204 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" event={"ID":"199b2880-60c1-441f-869d-29961d46b900","Type":"ContainerDied","Data":"f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2"} Apr 21 10:11:40.518243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:40.518237 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a6e68f90764d41a7435eb9fa908dd898d8b9723878a66a6bbaa71f11b325b2" Apr 21 10:11:40.518243 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:40.518243 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358rv55" Apr 21 10:11:42.877260 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877228 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4"] Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877513 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="util" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877527 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="util" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877560 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="pull" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877567 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="pull" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877573 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="extract" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877579 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="extract" Apr 21 10:11:42.877678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.877621 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="199b2880-60c1-441f-869d-29961d46b900" containerName="extract" Apr 21 10:11:42.880423 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.880409 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:42.884254 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.884228 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 10:11:42.884365 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.884330 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:11:42.884365 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.884330 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 10:11:42.885336 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.885314 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 10:11:42.885397 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.885321 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 10:11:42.885488 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.885390 2543 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lmcjb\"" Apr 21 10:11:42.893248 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.893223 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4"] Apr 21 10:11:42.960753 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.960724 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:42.960924 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.960768 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/49e18b7b-5798-414b-9c17-d0c5d24a8544-manager-config\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:42.960924 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.960786 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:42.960924 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:42.960885 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbn8\" (UniqueName: \"kubernetes.io/projected/49e18b7b-5798-414b-9c17-d0c5d24a8544-kube-api-access-8cbn8\") pod 
\"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.062196 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.062156 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbn8\" (UniqueName: \"kubernetes.io/projected/49e18b7b-5798-414b-9c17-d0c5d24a8544-kube-api-access-8cbn8\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.062404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.062208 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.062404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.062241 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/49e18b7b-5798-414b-9c17-d0c5d24a8544-manager-config\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.062404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.062259 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.062890 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:11:43.062870 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/49e18b7b-5798-414b-9c17-d0c5d24a8544-manager-config\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.064920 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.064898 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.064998 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.064949 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e18b7b-5798-414b-9c17-d0c5d24a8544-cert\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.073581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.073560 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbn8\" (UniqueName: \"kubernetes.io/projected/49e18b7b-5798-414b-9c17-d0c5d24a8544-kube-api-access-8cbn8\") pod \"lws-controller-manager-5bbdf94c78-5h9l4\" (UID: \"49e18b7b-5798-414b-9c17-d0c5d24a8544\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.189506 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.189419 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:43.308801 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.308773 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4"] Apr 21 10:11:43.311000 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:11:43.310974 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49e18b7b_5798_414b_9c17_d0c5d24a8544.slice/crio-ad9122c11fd42039f452b3c022bff46ff798506c0ab144bea9df840d585c8333 WatchSource:0}: Error finding container ad9122c11fd42039f452b3c022bff46ff798506c0ab144bea9df840d585c8333: Status 404 returned error can't find the container with id ad9122c11fd42039f452b3c022bff46ff798506c0ab144bea9df840d585c8333 Apr 21 10:11:43.528705 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:43.528623 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" event={"ID":"49e18b7b-5798-414b-9c17-d0c5d24a8544","Type":"ContainerStarted","Data":"ad9122c11fd42039f452b3c022bff46ff798506c0ab144bea9df840d585c8333"} Apr 21 10:11:45.535591 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:45.535540 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" event={"ID":"49e18b7b-5798-414b-9c17-d0c5d24a8544","Type":"ContainerStarted","Data":"f699ed8a39b978a3d5b7a5ef17f4594b8602268450d876bb56cf8d03a966d1ce"} Apr 21 10:11:45.535966 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:45.535723 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:11:45.552986 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:45.552937 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" podStartSLOduration=2.072772721 podStartE2EDuration="3.552925135s" podCreationTimestamp="2026-04-21 10:11:42 +0000 UTC" firstStartedPulling="2026-04-21 10:11:43.312721631 +0000 UTC m=+472.715645565" lastFinishedPulling="2026-04-21 10:11:44.792874025 +0000 UTC m=+474.195797979" observedRunningTime="2026-04-21 10:11:45.551436161 +0000 UTC m=+474.954360132" watchObservedRunningTime="2026-04-21 10:11:45.552925135 +0000 UTC m=+474.955849093" Apr 21 10:11:49.118390 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.118359 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2"] Apr 21 10:11:49.121530 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.121514 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.124680 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.124659 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\"" Apr 21 10:11:49.125195 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.125175 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 10:11:49.125678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.125661 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 10:11:49.135565 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.135522 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2"] Apr 21 10:11:49.212176 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.212145 2543 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvwm\" (UniqueName: \"kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.212315 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.212187 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.212315 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.212214 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.313101 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.313070 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.313245 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.313115 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.313245 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.313181 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvwm\" (UniqueName: \"kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.313491 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.313473 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.313628 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.313612 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.321959 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.321937 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvwm\" (UniqueName: 
\"kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.430465 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.430380 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:49.594532 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:49.594509 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2"] Apr 21 10:11:49.596219 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:11:49.596192 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0708a0_ee89_49ac_a33a_3b807f012459.slice/crio-47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c WatchSource:0}: Error finding container 47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c: Status 404 returned error can't find the container with id 47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c Apr 21 10:11:50.551102 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:50.551063 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerID="e6d48fd31768d7e2047d9af0464e3243f368656ce2201fd3cf79b151d14b7e10" exitCode=0 Apr 21 10:11:50.551446 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:50.551142 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" event={"ID":"fe0708a0-ee89-49ac-a33a-3b807f012459","Type":"ContainerDied","Data":"e6d48fd31768d7e2047d9af0464e3243f368656ce2201fd3cf79b151d14b7e10"} Apr 21 
10:11:50.551446 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:50.551169 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" event={"ID":"fe0708a0-ee89-49ac-a33a-3b807f012459","Type":"ContainerStarted","Data":"47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c"} Apr 21 10:11:52.559157 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:52.559123 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerID="6aced71e2e42576139af64f2a409da41cb77460a7a1ac2b90ad20d6dad02574e" exitCode=0 Apr 21 10:11:52.559560 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:52.559218 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" event={"ID":"fe0708a0-ee89-49ac-a33a-3b807f012459","Type":"ContainerDied","Data":"6aced71e2e42576139af64f2a409da41cb77460a7a1ac2b90ad20d6dad02574e"} Apr 21 10:11:53.563479 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:53.563442 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerID="2486ca2de5d4ce08f082f2f3873ed623d9a188f4d8bcc66090bf944bbe7244ae" exitCode=0 Apr 21 10:11:53.563833 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:53.563505 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" event={"ID":"fe0708a0-ee89-49ac-a33a-3b807f012459","Type":"ContainerDied","Data":"2486ca2de5d4ce08f082f2f3873ed623d9a188f4d8bcc66090bf944bbe7244ae"} Apr 21 10:11:54.682127 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.682104 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:54.759444 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.759407 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle\") pod \"fe0708a0-ee89-49ac-a33a-3b807f012459\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " Apr 21 10:11:54.759624 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.759477 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvwm\" (UniqueName: \"kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm\") pod \"fe0708a0-ee89-49ac-a33a-3b807f012459\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " Apr 21 10:11:54.759624 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.759508 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util\") pod \"fe0708a0-ee89-49ac-a33a-3b807f012459\" (UID: \"fe0708a0-ee89-49ac-a33a-3b807f012459\") " Apr 21 10:11:54.760356 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.760332 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle" (OuterVolumeSpecName: "bundle") pod "fe0708a0-ee89-49ac-a33a-3b807f012459" (UID: "fe0708a0-ee89-49ac-a33a-3b807f012459"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:54.761667 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.761641 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm" (OuterVolumeSpecName: "kube-api-access-cwvwm") pod "fe0708a0-ee89-49ac-a33a-3b807f012459" (UID: "fe0708a0-ee89-49ac-a33a-3b807f012459"). InnerVolumeSpecName "kube-api-access-cwvwm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:11:54.764850 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.764812 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util" (OuterVolumeSpecName: "util") pod "fe0708a0-ee89-49ac-a33a-3b807f012459" (UID: "fe0708a0-ee89-49ac-a33a-3b807f012459"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:54.860168 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.860131 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:54.860168 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.860167 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0708a0-ee89-49ac-a33a-3b807f012459-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:54.860351 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:54.860184 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cwvwm\" (UniqueName: \"kubernetes.io/projected/fe0708a0-ee89-49ac-a33a-3b807f012459-kube-api-access-cwvwm\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:11:55.570237 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:55.570148 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" event={"ID":"fe0708a0-ee89-49ac-a33a-3b807f012459","Type":"ContainerDied","Data":"47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c"} Apr 21 10:11:55.570237 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:55.570169 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2pnql2" Apr 21 10:11:55.570447 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:55.570180 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47869dff1f3aab28df0f7881ce2aeda2382f6796fa4831328c6fbb7a3793118c" Apr 21 10:11:56.541701 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:11:56.541670 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-5h9l4" Apr 21 10:12:23.481570 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481504 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5"] Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481878 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="pull" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481895 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="pull" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481905 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="extract" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481913 2543 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="extract" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481924 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="util" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481932 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="util" Apr 21 10:12:23.482066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.481993 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe0708a0-ee89-49ac-a33a-3b807f012459" containerName="extract" Apr 21 10:12:23.487160 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.487138 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.489736 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.489714 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcszx\"" Apr 21 10:12:23.489854 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.489808 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 10:12:23.490794 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.490774 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 10:12:23.494003 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.493979 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5"] Apr 21 10:12:23.558177 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.558148 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.558321 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.558188 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.558321 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.558215 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65cqh\" (UniqueName: \"kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.579005 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.578977 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d"] Apr 21 10:12:23.582775 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.582759 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.589763 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.589741 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d"] Apr 21 10:12:23.659059 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659031 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.659200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659076 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.659200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659110 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65cqh\" (UniqueName: \"kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.659200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659149 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.659200 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659181 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbdv5\" (UniqueName: \"kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.659375 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659215 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.659410 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659393 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.659443 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.659419 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.667827 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.667767 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65cqh\" (UniqueName: \"kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.679719 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.679695 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w"] Apr 21 10:12:23.683441 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.683427 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.689939 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.689913 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w"] Apr 21 10:12:23.760262 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760188 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.760262 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760231 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5gf\" (UniqueName: \"kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.760455 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760314 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.760455 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760348 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.760455 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760377 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbdv5\" (UniqueName: \"kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.760455 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760401 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.760618 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760534 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.760691 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.760673 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.773388 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.773364 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbdv5\" (UniqueName: \"kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.779683 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.779661 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p"] Apr 21 10:12:23.783417 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.783402 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.792022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.791999 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p"] Apr 21 10:12:23.797256 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.797235 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:23.860851 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.860825 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.860980 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.860867 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.860980 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.860907 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5gf\" (UniqueName: \"kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.860980 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.860958 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.861136 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.860986 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvsb\" (UniqueName: \"kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.861136 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.861019 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.861239 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.861199 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.861289 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.861263 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.870058 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.870033 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5gf\" (UniqueName: \"kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:23.891746 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.891722 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:23.919212 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.919188 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5"] Apr 21 10:12:23.921187 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:23.921160 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f65785_5448_4962_9812_636212be7162.slice/crio-65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88 WatchSource:0}: Error finding container 65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88: Status 404 returned error can't find the container with id 65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88 Apr 21 10:12:23.961571 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.961515 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: 
\"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.961705 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.961678 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.961774 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.961716 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvsb\" (UniqueName: \"kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.961942 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.961916 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.962044 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.961970 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.978503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.978477 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvsb\" (UniqueName: \"kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:23.992970 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:23.992946 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:24.033035 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.033007 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d"] Apr 21 10:12:24.069353 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:24.067933 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6fc070_5a78_4e3f_9cba_a83bd98827ef.slice/crio-7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556 WatchSource:0}: Error finding container 7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556: Status 404 returned error can't find the container with id 7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556 Apr 21 10:12:24.092678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.092655 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:24.116581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.116524 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w"] Apr 21 10:12:24.119073 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:24.119045 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267ecaec_d4f9_4c42_90ee_c17f9172c1e5.slice/crio-9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735 WatchSource:0}: Error finding container 9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735: Status 404 returned error can't find the container with id 9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735 Apr 21 10:12:24.223897 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.223873 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p"] Apr 21 10:12:24.260330 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:24.260302 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe62d3a0_8d64_4c49_bf63_eda15c5bafff.slice/crio-4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1 WatchSource:0}: Error finding container 4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1: Status 404 returned error can't find the container with id 4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1 Apr 21 10:12:24.465570 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.450962 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-96d4958b5-drlsc"] Apr 21 10:12:24.659043 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.659007 2543 generic.go:358] "Generic (PLEG): 
container finished" podID="52f65785-5448-4962-9812-636212be7162" containerID="870047c5ef9faf688931e3146ecddcbe127ee3c10a95643efe00fc325f956154" exitCode=0 Apr 21 10:12:24.659465 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.659092 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" event={"ID":"52f65785-5448-4962-9812-636212be7162","Type":"ContainerDied","Data":"870047c5ef9faf688931e3146ecddcbe127ee3c10a95643efe00fc325f956154"} Apr 21 10:12:24.659465 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.659127 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" event={"ID":"52f65785-5448-4962-9812-636212be7162","Type":"ContainerStarted","Data":"65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88"} Apr 21 10:12:24.660503 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.660480 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerID="6591e1a0467ba240f58fd8cf9116d0e0ab4d0cbab72399417c6964f5a20c0701" exitCode=0 Apr 21 10:12:24.660590 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.660524 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" event={"ID":"fe62d3a0-8d64-4c49-bf63-eda15c5bafff","Type":"ContainerDied","Data":"6591e1a0467ba240f58fd8cf9116d0e0ab4d0cbab72399417c6964f5a20c0701"} Apr 21 10:12:24.660590 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.660569 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" event={"ID":"fe62d3a0-8d64-4c49-bf63-eda15c5bafff","Type":"ContainerStarted","Data":"4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1"} Apr 21 10:12:24.661966 ip-10-0-140-144 
kubenswrapper[2543]: I0421 10:12:24.661947 2543 generic.go:358] "Generic (PLEG): container finished" podID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerID="7a760f081092b62e7d448e63d6c679677eb5abe866d321ccd974ddbdaac26a58" exitCode=0 Apr 21 10:12:24.662081 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.662062 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerDied","Data":"7a760f081092b62e7d448e63d6c679677eb5abe866d321ccd974ddbdaac26a58"} Apr 21 10:12:24.662139 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.662093 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerStarted","Data":"7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556"} Apr 21 10:12:24.668665 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.668633 2543 generic.go:358] "Generic (PLEG): container finished" podID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerID="f77bed5a724781b4d1931bbd426842dc258e07963fdae82bcf0425971f04832f" exitCode=0 Apr 21 10:12:24.668872 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.668668 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" event={"ID":"267ecaec-d4f9-4c42-90ee-c17f9172c1e5","Type":"ContainerDied","Data":"f77bed5a724781b4d1931bbd426842dc258e07963fdae82bcf0425971f04832f"} Apr 21 10:12:24.668872 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:24.668701 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" 
event={"ID":"267ecaec-d4f9-4c42-90ee-c17f9172c1e5","Type":"ContainerStarted","Data":"9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735"} Apr 21 10:12:25.673512 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:25.673483 2543 generic.go:358] "Generic (PLEG): container finished" podID="52f65785-5448-4962-9812-636212be7162" containerID="ff20d53e1fbaee4a2e1b6851db146fd61bd57ddae29ca58b49596bed1669b0fc" exitCode=0 Apr 21 10:12:25.673939 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:25.673589 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" event={"ID":"52f65785-5448-4962-9812-636212be7162","Type":"ContainerDied","Data":"ff20d53e1fbaee4a2e1b6851db146fd61bd57ddae29ca58b49596bed1669b0fc"} Apr 21 10:12:25.678297 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:25.678269 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerID="ad9518dfb993ade19607fbabbfdc7d275631678bbe5fd19887de02a97a93bf9c" exitCode=0 Apr 21 10:12:25.678403 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:25.678343 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" event={"ID":"fe62d3a0-8d64-4c49-bf63-eda15c5bafff","Type":"ContainerDied","Data":"ad9518dfb993ade19607fbabbfdc7d275631678bbe5fd19887de02a97a93bf9c"} Apr 21 10:12:25.680902 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:25.680843 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerStarted","Data":"12a24625408c336e7cc6db5953dfb65e556200c954ae9dfa655b4f20b44daca7"} Apr 21 10:12:26.686820 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.686788 2543 generic.go:358] "Generic (PLEG): container finished" 
podID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerID="db959d452d8cb42e347939fd562916a603316ce89888d1ddd30276ac2d4d5a91" exitCode=0 Apr 21 10:12:26.687229 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.686877 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" event={"ID":"267ecaec-d4f9-4c42-90ee-c17f9172c1e5","Type":"ContainerDied","Data":"db959d452d8cb42e347939fd562916a603316ce89888d1ddd30276ac2d4d5a91"} Apr 21 10:12:26.688919 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.688896 2543 generic.go:358] "Generic (PLEG): container finished" podID="52f65785-5448-4962-9812-636212be7162" containerID="96b659558a8f1cdfb80fca03634945893eb7aae208bebee892ad21efbe024cfe" exitCode=0 Apr 21 10:12:26.689022 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.688970 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" event={"ID":"52f65785-5448-4962-9812-636212be7162","Type":"ContainerDied","Data":"96b659558a8f1cdfb80fca03634945893eb7aae208bebee892ad21efbe024cfe"} Apr 21 10:12:26.690846 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.690822 2543 generic.go:358] "Generic (PLEG): container finished" podID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerID="a0ab70d39ccdff8e6030f84dfe4a277b3b1e18a9ded83aebf6e0d5fa2a62c975" exitCode=0 Apr 21 10:12:26.690941 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.690892 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" event={"ID":"fe62d3a0-8d64-4c49-bf63-eda15c5bafff","Type":"ContainerDied","Data":"a0ab70d39ccdff8e6030f84dfe4a277b3b1e18a9ded83aebf6e0d5fa2a62c975"} Apr 21 10:12:26.692469 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.692446 2543 generic.go:358] "Generic (PLEG): container finished" 
podID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerID="12a24625408c336e7cc6db5953dfb65e556200c954ae9dfa655b4f20b44daca7" exitCode=0 Apr 21 10:12:26.692578 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:26.692494 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerDied","Data":"12a24625408c336e7cc6db5953dfb65e556200c954ae9dfa655b4f20b44daca7"} Apr 21 10:12:27.698125 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.698091 2543 generic.go:358] "Generic (PLEG): container finished" podID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerID="1124e9a387fd03f14f017cf0f7efc11edd480ad555bd2ddd63db151dc56cf275" exitCode=0 Apr 21 10:12:27.698539 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.698178 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerDied","Data":"1124e9a387fd03f14f017cf0f7efc11edd480ad555bd2ddd63db151dc56cf275"} Apr 21 10:12:27.699931 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.699909 2543 generic.go:358] "Generic (PLEG): container finished" podID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerID="0e0b5c40522534930fdd47acd6ae1befea623757c8bb3638faf46758947b4bbc" exitCode=0 Apr 21 10:12:27.700019 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.699951 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" event={"ID":"267ecaec-d4f9-4c42-90ee-c17f9172c1e5","Type":"ContainerDied","Data":"0e0b5c40522534930fdd47acd6ae1befea623757c8bb3638faf46758947b4bbc"} Apr 21 10:12:27.828283 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.828262 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:27.856906 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.856884 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:27.893670 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.893648 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util\") pod \"52f65785-5448-4962-9812-636212be7162\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " Apr 21 10:12:27.893784 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.893678 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle\") pod \"52f65785-5448-4962-9812-636212be7162\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " Apr 21 10:12:27.893784 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.893705 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65cqh\" (UniqueName: \"kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh\") pod \"52f65785-5448-4962-9812-636212be7162\" (UID: \"52f65785-5448-4962-9812-636212be7162\") " Apr 21 10:12:27.893784 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.893742 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle\") pod \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " Apr 21 10:12:27.893784 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.893768 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util\") pod \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " Apr 21 10:12:27.894251 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.894226 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle" (OuterVolumeSpecName: "bundle") pod "fe62d3a0-8d64-4c49-bf63-eda15c5bafff" (UID: "fe62d3a0-8d64-4c49-bf63-eda15c5bafff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:27.894405 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.894388 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle" (OuterVolumeSpecName: "bundle") pod "52f65785-5448-4962-9812-636212be7162" (UID: "52f65785-5448-4962-9812-636212be7162"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:27.895981 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.895964 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh" (OuterVolumeSpecName: "kube-api-access-65cqh") pod "52f65785-5448-4962-9812-636212be7162" (UID: "52f65785-5448-4962-9812-636212be7162"). InnerVolumeSpecName "kube-api-access-65cqh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:12:27.901449 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.901422 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util" (OuterVolumeSpecName: "util") pod "fe62d3a0-8d64-4c49-bf63-eda15c5bafff" (UID: "fe62d3a0-8d64-4c49-bf63-eda15c5bafff"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:27.902136 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.902119 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util" (OuterVolumeSpecName: "util") pod "52f65785-5448-4962-9812-636212be7162" (UID: "52f65785-5448-4962-9812-636212be7162"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:27.994241 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994151 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfvsb\" (UniqueName: \"kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb\") pod \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\" (UID: \"fe62d3a0-8d64-4c49-bf63-eda15c5bafff\") " Apr 21 10:12:27.994373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994260 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:27.994373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994271 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:27.994373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994280 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52f65785-5448-4962-9812-636212be7162-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:27.994373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994290 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65cqh\" (UniqueName: 
\"kubernetes.io/projected/52f65785-5448-4962-9812-636212be7162-kube-api-access-65cqh\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:27.994373 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.994299 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:27.996313 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:27.996291 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb" (OuterVolumeSpecName: "kube-api-access-lfvsb") pod "fe62d3a0-8d64-4c49-bf63-eda15c5bafff" (UID: "fe62d3a0-8d64-4c49-bf63-eda15c5bafff"). InnerVolumeSpecName "kube-api-access-lfvsb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:12:28.094761 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.094718 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lfvsb\" (UniqueName: \"kubernetes.io/projected/fe62d3a0-8d64-4c49-bf63-eda15c5bafff-kube-api-access-lfvsb\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:12:28.704430 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.704393 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" event={"ID":"52f65785-5448-4962-9812-636212be7162","Type":"ContainerDied","Data":"65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88"} Apr 21 10:12:28.704430 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.704420 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503jhkj5" Apr 21 10:12:28.704430 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.704427 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65384c67c7ec369ae4bd1ab7fb06a96930ac5108e9895721f3dbda7458fa3d88" Apr 21 10:12:28.706386 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.706365 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" Apr 21 10:12:28.706559 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.706363 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b8t88p" event={"ID":"fe62d3a0-8d64-4c49-bf63-eda15c5bafff","Type":"ContainerDied","Data":"4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1"} Apr 21 10:12:28.706559 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.706468 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4679c9a8e5f2d3f43c51cae14f93959ee34337567b3d032270a0c0c5946715d1" Apr 21 10:12:28.845539 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.845520 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" Apr 21 10:12:28.848329 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.848308 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" Apr 21 10:12:28.901179 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901147 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle\") pod \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " Apr 21 10:12:28.901179 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901176 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util\") pod \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " Apr 21 10:12:28.901395 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901213 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle\") pod \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " Apr 21 10:12:28.901395 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901254 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util\") pod \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " Apr 21 10:12:28.901395 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901274 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5gf\" (UniqueName: \"kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf\") pod \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\" (UID: \"267ecaec-d4f9-4c42-90ee-c17f9172c1e5\") " Apr 21 10:12:28.901395 ip-10-0-140-144 kubenswrapper[2543]: I0421 
10:12:28.901291 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbdv5\" (UniqueName: \"kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5\") pod \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\" (UID: \"7a6fc070-5a78-4e3f-9cba-a83bd98827ef\") " Apr 21 10:12:28.901836 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901812 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle" (OuterVolumeSpecName: "bundle") pod "7a6fc070-5a78-4e3f-9cba-a83bd98827ef" (UID: "7a6fc070-5a78-4e3f-9cba-a83bd98827ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:28.901836 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.901823 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle" (OuterVolumeSpecName: "bundle") pod "267ecaec-d4f9-4c42-90ee-c17f9172c1e5" (UID: "267ecaec-d4f9-4c42-90ee-c17f9172c1e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:28.903581 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.903537 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5" (OuterVolumeSpecName: "kube-api-access-wbdv5") pod "7a6fc070-5a78-4e3f-9cba-a83bd98827ef" (UID: "7a6fc070-5a78-4e3f-9cba-a83bd98827ef"). InnerVolumeSpecName "kube-api-access-wbdv5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:12:28.903709 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.903676 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf" (OuterVolumeSpecName: "kube-api-access-pk5gf") pod "267ecaec-d4f9-4c42-90ee-c17f9172c1e5" (UID: "267ecaec-d4f9-4c42-90ee-c17f9172c1e5"). InnerVolumeSpecName "kube-api-access-pk5gf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:12:28.907172 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.907150 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util" (OuterVolumeSpecName: "util") pod "267ecaec-d4f9-4c42-90ee-c17f9172c1e5" (UID: "267ecaec-d4f9-4c42-90ee-c17f9172c1e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:12:28.909400 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:28.909364 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util" (OuterVolumeSpecName: "util") pod "7a6fc070-5a78-4e3f-9cba-a83bd98827ef" (UID: "7a6fc070-5a78-4e3f-9cba-a83bd98827ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002019 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002048 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002060 2543 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002070 2543 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-util\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002083 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pk5gf\" (UniqueName: \"kubernetes.io/projected/267ecaec-d4f9-4c42-90ee-c17f9172c1e5-kube-api-access-pk5gf\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.002106 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.002097 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbdv5\" (UniqueName: \"kubernetes.io/projected/7a6fc070-5a78-4e3f-9cba-a83bd98827ef-kube-api-access-wbdv5\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:29.712014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.711978 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w" event={"ID":"267ecaec-d4f9-4c42-90ee-c17f9172c1e5","Type":"ContainerDied","Data":"9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735"}
Apr 21 10:12:29.712014 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.712010 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88kg77w"
Apr 21 10:12:29.712498 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.712016 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d0ad3dc0ad4a7d29a3be6c2a55214ff3f40f71e5f8e389a7db4ba39c47a9735"
Apr 21 10:12:29.713713 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.713690 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d" event={"ID":"7a6fc070-5a78-4e3f-9cba-a83bd98827ef","Type":"ContainerDied","Data":"7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556"}
Apr 21 10:12:29.713821 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.713725 2543 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7c9bd5944fed61ab581d46a3dbd94c8a7c644c5195f2360bbe4c35d91d8556"
Apr 21 10:12:29.713821 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:29.713700 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302qp6d"
Apr 21 10:12:42.666290 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666206 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"]
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666450 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666461 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666473 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666478 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666485 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52f65785-5448-4962-9812-636212be7162" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666490 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f65785-5448-4962-9812-636212be7162" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666495 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666500 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666510 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666515 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666522 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52f65785-5448-4962-9812-636212be7162" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666527 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f65785-5448-4962-9812-636212be7162" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666532 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666537 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666563 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666572 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666580 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666585 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="util"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666591 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52f65785-5448-4962-9812-636212be7162" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666596 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f65785-5448-4962-9812-636212be7162" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666602 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666607 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666613 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666617 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="pull"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666658 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="52f65785-5448-4962-9812-636212be7162" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666665 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe62d3a0-8d64-4c49-bf63-eda15c5bafff" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666670 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="267ecaec-d4f9-4c42-90ee-c17f9172c1e5" containerName="extract"
Apr 21 10:12:42.666728 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.666677 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a6fc070-5a78-4e3f-9cba-a83bd98827ef" containerName="extract"
Apr 21 10:12:42.669300 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.669282 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:42.673253 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.673229 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 10:12:42.673253 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.673243 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 10:12:42.674208 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.674189 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-nr4xl\""
Apr 21 10:12:42.682109 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.682083 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"]
Apr 21 10:12:42.685075 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.685053 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z96t\" (UniqueName: \"kubernetes.io/projected/32f01b6e-e887-4f64-ae92-86472192e8ca-kube-api-access-2z96t\") pod \"limitador-operator-controller-manager-c7fb4c8d5-pb57l\" (UID: \"32f01b6e-e887-4f64-ae92-86472192e8ca\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:42.785636 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.785602 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z96t\" (UniqueName: \"kubernetes.io/projected/32f01b6e-e887-4f64-ae92-86472192e8ca-kube-api-access-2z96t\") pod \"limitador-operator-controller-manager-c7fb4c8d5-pb57l\" (UID: \"32f01b6e-e887-4f64-ae92-86472192e8ca\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:42.794389 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.794363 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z96t\" (UniqueName: \"kubernetes.io/projected/32f01b6e-e887-4f64-ae92-86472192e8ca-kube-api-access-2z96t\") pod \"limitador-operator-controller-manager-c7fb4c8d5-pb57l\" (UID: \"32f01b6e-e887-4f64-ae92-86472192e8ca\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:42.947452 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.947368 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"]
Apr 21 10:12:42.950422 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.950406 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:42.952807 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.952783 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 21 10:12:42.952959 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.952941 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 21 10:12:42.953271 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.953255 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-zjcn8\""
Apr 21 10:12:42.957807 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.957786 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"]
Apr 21 10:12:42.979273 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.979236 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:42.987258 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.987233 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/865fabdb-4c55-4b29-8521-21e4784a5f33-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:42.987369 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.987277 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/865fabdb-4c55-4b29-8521-21e4784a5f33-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:42.987415 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:42.987397 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp5zd\" (UniqueName: \"kubernetes.io/projected/865fabdb-4c55-4b29-8521-21e4784a5f33-kube-api-access-fp5zd\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.088065 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.088026 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp5zd\" (UniqueName: \"kubernetes.io/projected/865fabdb-4c55-4b29-8521-21e4784a5f33-kube-api-access-fp5zd\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.088218 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.088083 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/865fabdb-4c55-4b29-8521-21e4784a5f33-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.088218 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.088103 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/865fabdb-4c55-4b29-8521-21e4784a5f33-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.088652 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.088634 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/865fabdb-4c55-4b29-8521-21e4784a5f33-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.090642 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.090618 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/865fabdb-4c55-4b29-8521-21e4784a5f33-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.096039 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.096018 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp5zd\" (UniqueName: \"kubernetes.io/projected/865fabdb-4c55-4b29-8521-21e4784a5f33-kube-api-access-fp5zd\") pod \"kuadrant-console-plugin-6c886788f8-5fkrf\" (UID: \"865fabdb-4c55-4b29-8521-21e4784a5f33\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.103208 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.103186 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"]
Apr 21 10:12:43.105392 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:43.105370 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f01b6e_e887_4f64_ae92_86472192e8ca.slice/crio-2442fc582f099aaad7bcca6df86abff881236da6c5cc63e5ad4381a47a4a4a00 WatchSource:0}: Error finding container 2442fc582f099aaad7bcca6df86abff881236da6c5cc63e5ad4381a47a4a4a00: Status 404 returned error can't find the container with id 2442fc582f099aaad7bcca6df86abff881236da6c5cc63e5ad4381a47a4a4a00
Apr 21 10:12:43.260593 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.260497 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"
Apr 21 10:12:43.378841 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.378750 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf"]
Apr 21 10:12:43.381078 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:12:43.381052 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod865fabdb_4c55_4b29_8521_21e4784a5f33.slice/crio-c3f38645d8a366a0ce9f7154c4e95e4823948d21333f437b58c23563539c5de5 WatchSource:0}: Error finding container c3f38645d8a366a0ce9f7154c4e95e4823948d21333f437b58c23563539c5de5: Status 404 returned error can't find the container with id c3f38645d8a366a0ce9f7154c4e95e4823948d21333f437b58c23563539c5de5
Apr 21 10:12:43.761944 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.761912 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf" event={"ID":"865fabdb-4c55-4b29-8521-21e4784a5f33","Type":"ContainerStarted","Data":"c3f38645d8a366a0ce9f7154c4e95e4823948d21333f437b58c23563539c5de5"}
Apr 21 10:12:43.763055 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:43.763034 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l" event={"ID":"32f01b6e-e887-4f64-ae92-86472192e8ca","Type":"ContainerStarted","Data":"2442fc582f099aaad7bcca6df86abff881236da6c5cc63e5ad4381a47a4a4a00"}
Apr 21 10:12:46.776723 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:46.776664 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l" event={"ID":"32f01b6e-e887-4f64-ae92-86472192e8ca","Type":"ContainerStarted","Data":"627af1d5e0e7c0c8349b54e40718c6a7ea558055fe330fe01497e2401f7a7071"}
Apr 21 10:12:46.777196 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:46.776806 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:12:46.795781 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:46.795729 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l" podStartSLOduration=2.128238929 podStartE2EDuration="4.795715173s" podCreationTimestamp="2026-04-21 10:12:42 +0000 UTC" firstStartedPulling="2026-04-21 10:12:43.107347857 +0000 UTC m=+532.510271795" lastFinishedPulling="2026-04-21 10:12:45.7748241 +0000 UTC m=+535.177748039" observedRunningTime="2026-04-21 10:12:46.794835448 +0000 UTC m=+536.197759407" watchObservedRunningTime="2026-04-21 10:12:46.795715173 +0000 UTC m=+536.198639131"
Apr 21 10:12:48.785025 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:48.784930 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf" event={"ID":"865fabdb-4c55-4b29-8521-21e4784a5f33","Type":"ContainerStarted","Data":"3191fbe45ff1fc280aa7fbe63232ba971961e27b31e7e261b0572a7ba1080c2b"}
Apr 21 10:12:48.802558 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:48.802494 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-5fkrf" podStartSLOduration=1.842663984 podStartE2EDuration="6.802480151s" podCreationTimestamp="2026-04-21 10:12:42 +0000 UTC" firstStartedPulling="2026-04-21 10:12:43.382576658 +0000 UTC m=+532.785500597" lastFinishedPulling="2026-04-21 10:12:48.34239282 +0000 UTC m=+537.745316764" observedRunningTime="2026-04-21 10:12:48.800567523 +0000 UTC m=+538.203491530" watchObservedRunningTime="2026-04-21 10:12:48.802480151 +0000 UTC m=+538.205404108"
Apr 21 10:12:49.477001 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.476964 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-96d4958b5-drlsc" podUID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" containerName="console" containerID="cri-o://29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713" gracePeriod=15
Apr 21 10:12:49.705270 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.705245 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-96d4958b5-drlsc_3b6d8d2c-693e-4edb-9709-9c56105eefbe/console/0.log"
Apr 21 10:12:49.705384 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.705307 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:12:49.744128 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744060 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744128 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744097 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744128 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744118 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744388 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744159 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744388 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744181 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd7xt\" (UniqueName: \"kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744388 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744204 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744388 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744240 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config\") pod \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\" (UID: \"3b6d8d2c-693e-4edb-9709-9c56105eefbe\") "
Apr 21 10:12:49.744647 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744482 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca" (OuterVolumeSpecName: "service-ca") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:12:49.744698 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744649 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:12:49.744734 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744705 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:12:49.744775 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.744742 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config" (OuterVolumeSpecName: "console-config") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:12:49.746595 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.746575 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:12:49.746595 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.746583 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:12:49.746702 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.746669 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt" (OuterVolumeSpecName: "kube-api-access-cd7xt") pod "3b6d8d2c-693e-4edb-9709-9c56105eefbe" (UID: "3b6d8d2c-693e-4edb-9709-9c56105eefbe"). InnerVolumeSpecName "kube-api-access-cd7xt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:12:49.789944 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.789920 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-96d4958b5-drlsc_3b6d8d2c-693e-4edb-9709-9c56105eefbe/console/0.log"
Apr 21 10:12:49.790367 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.789958 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" containerID="29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713" exitCode=2
Apr 21 10:12:49.790367 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.790040 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-96d4958b5-drlsc"
Apr 21 10:12:49.790367 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.790053 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d4958b5-drlsc" event={"ID":"3b6d8d2c-693e-4edb-9709-9c56105eefbe","Type":"ContainerDied","Data":"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"}
Apr 21 10:12:49.790367 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.790092 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d4958b5-drlsc" event={"ID":"3b6d8d2c-693e-4edb-9709-9c56105eefbe","Type":"ContainerDied","Data":"b5f8d2e67f0272995242829c0da2d1915e62d726c3d4a7b0b46f816edb8d0848"}
Apr 21 10:12:49.790367 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.790111 2543 scope.go:117] "RemoveContainer" containerID="29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"
Apr 21 10:12:49.798617 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.798600 2543 scope.go:117] "RemoveContainer" containerID="29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"
Apr 21 10:12:49.798836 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:12:49.798817 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713\": container with ID starting with 29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713 not found: ID does not exist" containerID="29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"
Apr 21 10:12:49.798880 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.798843 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713"} err="failed to get container status \"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713\": rpc error: code = NotFound desc = could not find container \"29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713\": container with ID starting with 29300deff75f82aa952e5d6b7013539b6c693288bf64738da958ee22e0455713 not found: ID does not exist"
Apr 21 10:12:49.810597 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.810570 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-96d4958b5-drlsc"]
Apr 21 10:12:49.813918 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.813898 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-96d4958b5-drlsc"]
Apr 21 10:12:49.845576 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845524 2543 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-oauth-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845576 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845573 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cd7xt\" (UniqueName: \"kubernetes.io/projected/3b6d8d2c-693e-4edb-9709-9c56105eefbe-kube-api-access-cd7xt\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845694 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845586 2543 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845694 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845600 2543 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-console-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845694 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845613 2543 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-service-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845694 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845622 2543 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:49.845694 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:49.845630 2543 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b6d8d2c-693e-4edb-9709-9c56105eefbe-oauth-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 21 10:12:51.150910 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:51.150863 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" path="/var/lib/kubelet/pods/3b6d8d2c-693e-4edb-9709-9c56105eefbe/volumes"
Apr 21 10:12:57.783153 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:12:57.783122 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-pb57l"
Apr 21 10:23:47.362340 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.362306 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znjg8/must-gather-pk98q"]
Apr 21 10:23:47.364664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.362585 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" containerName="console"
Apr 21 10:23:47.364664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.362596 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" containerName="console"
Apr 21 10:23:47.364664 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.362648 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b6d8d2c-693e-4edb-9709-9c56105eefbe" containerName="console"
Apr 21 10:23:47.365517 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.365501 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.368197 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.368174 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znjg8\"/\"kube-root-ca.crt\""
Apr 21 10:23:47.368318 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.368174 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znjg8\"/\"openshift-service-ca.crt\""
Apr 21 10:23:47.369138 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.369120 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-znjg8\"/\"default-dockercfg-lvjxs\""
Apr 21 10:23:47.381091 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.381067 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znjg8/must-gather-pk98q"]
Apr 21 10:23:47.462798 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.462764 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.462973 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.462807 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzsp9\" (UniqueName: \"kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.563719 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.563687 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.563719 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.563719 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzsp9\" (UniqueName: \"kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.564033 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.564011 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.578868 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.578844 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzsp9\" (UniqueName: \"kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9\") pod \"must-gather-pk98q\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " pod="openshift-must-gather-znjg8/must-gather-pk98q"
Apr 21 10:23:47.674781 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.674696 2543 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-znjg8/must-gather-pk98q" Apr 21 10:23:47.795462 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.795422 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znjg8/must-gather-pk98q"] Apr 21 10:23:47.797531 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:23:47.797506 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd16de088_a440_4604_80c5_8127190d441c.slice/crio-3779d67b6aae8bbdee80eb7ded379146785aa14b42e5575b1ddb2100349dcb4f WatchSource:0}: Error finding container 3779d67b6aae8bbdee80eb7ded379146785aa14b42e5575b1ddb2100349dcb4f: Status 404 returned error can't find the container with id 3779d67b6aae8bbdee80eb7ded379146785aa14b42e5575b1ddb2100349dcb4f Apr 21 10:23:47.799104 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.799086 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:23:47.943649 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:47.943565 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjg8/must-gather-pk98q" event={"ID":"d16de088-a440-4604-80c5-8127190d441c","Type":"ContainerStarted","Data":"3779d67b6aae8bbdee80eb7ded379146785aa14b42e5575b1ddb2100349dcb4f"} Apr 21 10:23:53.967488 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:53.967445 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjg8/must-gather-pk98q" event={"ID":"d16de088-a440-4604-80c5-8127190d441c","Type":"ContainerStarted","Data":"bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73"} Apr 21 10:23:53.967488 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:53.967495 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjg8/must-gather-pk98q" 
event={"ID":"d16de088-a440-4604-80c5-8127190d441c","Type":"ContainerStarted","Data":"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab"} Apr 21 10:23:53.983358 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:23:53.983284 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znjg8/must-gather-pk98q" podStartSLOduration=1.912040582 podStartE2EDuration="6.983264955s" podCreationTimestamp="2026-04-21 10:23:47 +0000 UTC" firstStartedPulling="2026-04-21 10:23:47.79920904 +0000 UTC m=+1197.202132975" lastFinishedPulling="2026-04-21 10:23:52.870433395 +0000 UTC m=+1202.273357348" observedRunningTime="2026-04-21 10:23:53.982307303 +0000 UTC m=+1203.385231259" watchObservedRunningTime="2026-04-21 10:23:53.983264955 +0000 UTC m=+1203.386188913" Apr 21 10:24:02.552016 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:02.551985 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-5fkrf_865fabdb-4c55-4b29-8521-21e4784a5f33/kuadrant-console-plugin/0.log" Apr 21 10:24:02.591715 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:02.591679 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-pb57l_32f01b6e-e887-4f64-ae92-86472192e8ca/manager/0.log" Apr 21 10:24:06.011060 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:06.011026 2543 generic.go:358] "Generic (PLEG): container finished" podID="d16de088-a440-4604-80c5-8127190d441c" containerID="229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab" exitCode=0 Apr 21 10:24:06.011602 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:06.011107 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znjg8/must-gather-pk98q" event={"ID":"d16de088-a440-4604-80c5-8127190d441c","Type":"ContainerDied","Data":"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab"} Apr 21 10:24:06.011602 ip-10-0-140-144 
kubenswrapper[2543]: I0421 10:24:06.011519 2543 scope.go:117] "RemoveContainer" containerID="229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab" Apr 21 10:24:06.233852 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:06.233813 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjg8_must-gather-pk98q_d16de088-a440-4604-80c5-8127190d441c/gather/0.log" Apr 21 10:24:09.544066 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:09.544035 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lnpnt_96143cfb-12f0-4d9d-bf83-e55429b937c9/global-pull-secret-syncer/0.log" Apr 21 10:24:09.656563 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:09.656512 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w982h_687fb9cb-bcbd-4bee-961a-a42877702749/konnectivity-agent/0.log" Apr 21 10:24:09.700244 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:09.700211 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-144.ec2.internal_bc4de9b4ca4b46a01bc475a4f767a339/haproxy/0.log" Apr 21 10:24:11.617472 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.617440 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znjg8/must-gather-pk98q"] Apr 21 10:24:11.618118 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.617677 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-znjg8/must-gather-pk98q" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="copy" containerID="cri-o://bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73" gracePeriod=2 Apr 21 10:24:11.619918 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.619883 2543 status_manager.go:895] "Failed to get status for pod" podUID="d16de088-a440-4604-80c5-8127190d441c" pod="openshift-must-gather-znjg8/must-gather-pk98q" err="pods 
\"must-gather-pk98q\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-znjg8\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 21 10:24:11.621179 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.621158 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znjg8/must-gather-pk98q"] Apr 21 10:24:11.842865 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.842844 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjg8_must-gather-pk98q_d16de088-a440-4604-80c5-8127190d441c/copy/0.log" Apr 21 10:24:11.843164 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.843150 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znjg8/must-gather-pk98q" Apr 21 10:24:11.845643 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.845615 2543 status_manager.go:895] "Failed to get status for pod" podUID="d16de088-a440-4604-80c5-8127190d441c" pod="openshift-must-gather-znjg8/must-gather-pk98q" err="pods \"must-gather-pk98q\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-znjg8\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 21 10:24:11.951595 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.951495 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output\") pod \"d16de088-a440-4604-80c5-8127190d441c\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " Apr 21 10:24:11.951730 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.951599 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzsp9\" 
(UniqueName: \"kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9\") pod \"d16de088-a440-4604-80c5-8127190d441c\" (UID: \"d16de088-a440-4604-80c5-8127190d441c\") " Apr 21 10:24:11.953831 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.953804 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9" (OuterVolumeSpecName: "kube-api-access-nzsp9") pod "d16de088-a440-4604-80c5-8127190d441c" (UID: "d16de088-a440-4604-80c5-8127190d441c"). InnerVolumeSpecName "kube-api-access-nzsp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:24:11.954986 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:11.954961 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d16de088-a440-4604-80c5-8127190d441c" (UID: "d16de088-a440-4604-80c5-8127190d441c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:24:12.031628 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.031601 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znjg8_must-gather-pk98q_d16de088-a440-4604-80c5-8127190d441c/copy/0.log" Apr 21 10:24:12.031913 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.031893 2543 generic.go:358] "Generic (PLEG): container finished" podID="d16de088-a440-4604-80c5-8127190d441c" containerID="bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73" exitCode=143 Apr 21 10:24:12.031983 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.031944 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znjg8/must-gather-pk98q" Apr 21 10:24:12.031983 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.031958 2543 scope.go:117] "RemoveContainer" containerID="bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73" Apr 21 10:24:12.038329 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.034291 2543 status_manager.go:895] "Failed to get status for pod" podUID="d16de088-a440-4604-80c5-8127190d441c" pod="openshift-must-gather-znjg8/must-gather-pk98q" err="pods \"must-gather-pk98q\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-znjg8\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 21 10:24:12.043293 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.043276 2543 scope.go:117] "RemoveContainer" containerID="229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab" Apr 21 10:24:12.044759 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.044734 2543 status_manager.go:895] "Failed to get status for pod" podUID="d16de088-a440-4604-80c5-8127190d441c" pod="openshift-must-gather-znjg8/must-gather-pk98q" err="pods \"must-gather-pk98q\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-znjg8\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 21 10:24:12.053048 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.053027 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzsp9\" (UniqueName: \"kubernetes.io/projected/d16de088-a440-4604-80c5-8127190d441c-kube-api-access-nzsp9\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:24:12.053128 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.053051 2543 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/d16de088-a440-4604-80c5-8127190d441c-must-gather-output\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 21 10:24:12.056182 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.056160 2543 scope.go:117] "RemoveContainer" containerID="bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73" Apr 21 10:24:12.056428 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:24:12.056402 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73\": container with ID starting with bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73 not found: ID does not exist" containerID="bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73" Apr 21 10:24:12.056474 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.056429 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73"} err="failed to get container status \"bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73\": rpc error: code = NotFound desc = could not find container \"bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73\": container with ID starting with bd816b5d60e0d8ea7fbba8615f728ebd919aa5b3691d783b753549dc349e2b73 not found: ID does not exist" Apr 21 10:24:12.056474 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.056449 2543 scope.go:117] "RemoveContainer" containerID="229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab" Apr 21 10:24:12.056762 ip-10-0-140-144 kubenswrapper[2543]: E0421 10:24:12.056744 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab\": container with ID starting with 
229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab not found: ID does not exist" containerID="229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab" Apr 21 10:24:12.056811 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.056767 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab"} err="failed to get container status \"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab\": rpc error: code = NotFound desc = could not find container \"229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab\": container with ID starting with 229bfd9dd3cfd1cc6045db08b45be2d873e603d47c40f5b692047786cabaefab not found: ID does not exist" Apr 21 10:24:12.954678 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:12.954649 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-5fkrf_865fabdb-4c55-4b29-8521-21e4784a5f33/kuadrant-console-plugin/0.log" Apr 21 10:24:13.019153 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:13.019115 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-pb57l_32f01b6e-e887-4f64-ae92-86472192e8ca/manager/0.log" Apr 21 10:24:13.150879 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:13.150845 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16de088-a440-4604-80c5-8127190d441c" path="/var/lib/kubelet/pods/d16de088-a440-4604-80c5-8127190d441c/volumes" Apr 21 10:24:14.549742 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:14.549712 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2kmxv_2bb04fe7-fa18-4824-9af9-59a002fdcc8b/node-exporter/0.log" Apr 21 10:24:14.569646 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:14.569612 2543 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-2kmxv_2bb04fe7-fa18-4824-9af9-59a002fdcc8b/kube-rbac-proxy/0.log" Apr 21 10:24:14.592389 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:14.592368 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2kmxv_2bb04fe7-fa18-4824-9af9-59a002fdcc8b/init-textfile/0.log" Apr 21 10:24:18.203404 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203374 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"] Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203632 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="copy" Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203643 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="copy" Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203665 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="gather" Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203670 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="gather" Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203710 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="copy" Apr 21 10:24:18.203888 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.203719 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="d16de088-a440-4604-80c5-8127190d441c" containerName="gather" Apr 21 10:24:18.206738 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.206714 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.209147 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.209124 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"kube-root-ca.crt\"" Apr 21 10:24:18.210181 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.210163 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f55k6\"/\"default-dockercfg-nx89h\"" Apr 21 10:24:18.210248 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.210163 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"openshift-service-ca.crt\"" Apr 21 10:24:18.217087 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.217056 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"] Apr 21 10:24:18.300585 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.300534 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bpvb\" (UniqueName: \"kubernetes.io/projected/14efcf56-0e6f-40cb-be38-5f58b7cec60e-kube-api-access-8bpvb\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.300585 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.300587 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-podres\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.300805 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.300698 2543 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-proc\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.300805 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.300741 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-lib-modules\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.300805 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.300786 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-sys\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401787 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401758 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bpvb\" (UniqueName: \"kubernetes.io/projected/14efcf56-0e6f-40cb-be38-5f58b7cec60e-kube-api-access-8bpvb\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401787 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401791 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-podres\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " 
pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401833 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-proc\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401855 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-lib-modules\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401880 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-sys\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401926 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-proc\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401952 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-sys\") pod 
\"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401966 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-podres\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.401996 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.401975 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14efcf56-0e6f-40cb-be38-5f58b7cec60e-lib-modules\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.410024 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.410003 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bpvb\" (UniqueName: \"kubernetes.io/projected/14efcf56-0e6f-40cb-be38-5f58b7cec60e-kube-api-access-8bpvb\") pod \"perf-node-gather-daemonset-mmkbw\" (UID: \"14efcf56-0e6f-40cb-be38-5f58b7cec60e\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" Apr 21 10:24:18.517077 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.517009 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"
Apr 21 10:24:18.636639 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.636613 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"]
Apr 21 10:24:18.638443 ip-10-0-140-144 kubenswrapper[2543]: W0421 10:24:18.638409 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14efcf56_0e6f_40cb_be38_5f58b7cec60e.slice/crio-9f2baeb2d01999bc5b44f04199abde12d6e394436df8dffcf1af6c257f363bd0 WatchSource:0}: Error finding container 9f2baeb2d01999bc5b44f04199abde12d6e394436df8dffcf1af6c257f363bd0: Status 404 returned error can't find the container with id 9f2baeb2d01999bc5b44f04199abde12d6e394436df8dffcf1af6c257f363bd0
Apr 21 10:24:18.822117 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.822035 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bvmct_8165d059-3605-4311-b7f3-ad6cd4ab874b/dns/0.log"
Apr 21 10:24:18.840454 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.840433 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bvmct_8165d059-3605-4311-b7f3-ad6cd4ab874b/kube-rbac-proxy/0.log"
Apr 21 10:24:18.947725 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:18.947696 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8gn9x_8d1b8400-a581-4d5c-8d5f-39cdbe726442/dns-node-resolver/0.log"
Apr 21 10:24:19.059754 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.059715 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" event={"ID":"14efcf56-0e6f-40cb-be38-5f58b7cec60e","Type":"ContainerStarted","Data":"ed51347a4eab2d3ba74574caa2756573842dc3d027c516c5c9737e3ab52c1390"}
Apr 21 10:24:19.059754 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.059754 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" event={"ID":"14efcf56-0e6f-40cb-be38-5f58b7cec60e","Type":"ContainerStarted","Data":"9f2baeb2d01999bc5b44f04199abde12d6e394436df8dffcf1af6c257f363bd0"}
Apr 21 10:24:19.059992 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.059826 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"
Apr 21 10:24:19.076172 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.076085 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw" podStartSLOduration=1.076070835 podStartE2EDuration="1.076070835s" podCreationTimestamp="2026-04-21 10:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:24:19.074133042 +0000 UTC m=+1228.477057000" watchObservedRunningTime="2026-04-21 10:24:19.076070835 +0000 UTC m=+1228.478994840"
Apr 21 10:24:19.398601 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.398570 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-d5f489fc7-4vt77_c6c2913a-2e69-4f0a-afd1-391ff1fb6798/registry/0.log"
Apr 21 10:24:19.442758 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:19.442721 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbz9p_76addc89-f8bd-48f1-b215-22850711d8a8/node-ca/0.log"
Apr 21 10:24:20.687453 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:20.687423 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zbcx6_8c2132b5-17f3-4ef3-89d6-07554e363088/serve-healthcheck-canary/0.log"
Apr 21 10:24:21.082672 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:21.082598 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2swzj_baa2cf7c-1295-4216-ad7b-9dfb4f679b84/kube-rbac-proxy/0.log"
Apr 21 10:24:21.100732 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:21.100710 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2swzj_baa2cf7c-1295-4216-ad7b-9dfb4f679b84/exporter/0.log"
Apr 21 10:24:21.121813 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:21.121789 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2swzj_baa2cf7c-1295-4216-ad7b-9dfb4f679b84/extractor/0.log"
Apr 21 10:24:23.263795 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:23.263763 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bbdf94c78-5h9l4_49e18b7b-5798-414b-9c17-d0c5d24a8544/manager/0.log"
Apr 21 10:24:25.072011 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:25.071980 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-mmkbw"
Apr 21 10:24:28.496194 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.496113 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/kube-multus-additional-cni-plugins/0.log"
Apr 21 10:24:28.516533 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.516506 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/egress-router-binary-copy/0.log"
Apr 21 10:24:28.546875 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.546845 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/cni-plugins/0.log"
Apr 21 10:24:28.567718 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.567685 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/bond-cni-plugin/0.log"
Apr 21 10:24:28.589632 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.589610 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/routeoverride-cni/0.log"
Apr 21 10:24:28.609097 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.609071 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/whereabouts-cni-bincopy/0.log"
Apr 21 10:24:28.630623 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.630598 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tf4pw_9dca93be-3209-499b-9769-928f3d717103/whereabouts-cni/0.log"
Apr 21 10:24:28.866170 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.866135 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jjplc_7e9f91d5-c2d9-42cb-b675-fa8a2b0ea823/kube-multus/0.log"
Apr 21 10:24:28.931432 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.931382 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rjs4_e8bb2bdc-f702-42cf-a999-1816acd364ba/network-metrics-daemon/0.log"
Apr 21 10:24:28.950714 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:28.950691 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rjs4_e8bb2bdc-f702-42cf-a999-1816acd364ba/kube-rbac-proxy/0.log"
Apr 21 10:24:29.855005 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.854973 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/ovn-controller/0.log"
Apr 21 10:24:29.882726 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.882699 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/ovn-acl-logging/0.log"
Apr 21 10:24:29.907665 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.907642 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/kube-rbac-proxy-node/0.log"
Apr 21 10:24:29.927384 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.927362 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 10:24:29.942020 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.942001 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/northd/0.log"
Apr 21 10:24:29.960394 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.960377 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/nbdb/0.log"
Apr 21 10:24:29.979178 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:29.979156 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/sbdb/0.log"
Apr 21 10:24:30.140654 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:30.140570 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpb6h_324b8f2e-48a1-43e9-b457-d61a6b4fe663/ovnkube-controller/0.log"
Apr 21 10:24:31.674455 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:31.674428 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5d6q8_c91615b3-acf6-4090-ad49-f307699df3ec/network-check-target-container/0.log"
Apr 21 10:24:32.699524 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:32.699489 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-x99h9_e382252c-0c4b-41ed-b8ca-d5f651a559a8/iptables-alerter/0.log"
Apr 21 10:24:33.353290 ip-10-0-140-144 kubenswrapper[2543]: I0421 10:24:33.353259 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6xxzz_bd31fa67-8392-4eed-b4a6-760a1b7abbf7/tuned/0.log"