Apr 16 04:24:04.059600 ip-10-0-140-211 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 04:24:04.544230 ip-10-0-140-211 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:04.544230 ip-10-0-140-211 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 04:24:04.544230 ip-10-0-140-211 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:04.544230 ip-10-0-140-211 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 04:24:04.544230 ip-10-0-140-211 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:04.546117 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.546026 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 04:24:04.549274 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549258 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.549274 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549275 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549279 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549282 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549286 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549289 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549308 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549311 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549315 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549318 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549323 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549328 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549331 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549334 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549337 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549340 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549343 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549345 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549348 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549351 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549354 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.549348 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549357 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549361 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549373 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549376 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549379 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549382 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549384 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549387 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549390 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549392 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549396 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549399 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549401 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549404 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549406 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549409 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549412 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549414 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549417 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549419 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.549819 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549422 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549424 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549427 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549429 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549432 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549434 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549437 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549439 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549442 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549444 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549447 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549449 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549452 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549454 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549458 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549461 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549463 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549466 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549469 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549471 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.550353 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549474 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549477 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549479 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549482 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549484 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549488 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549492 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549496 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549498 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549501 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549504 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549507 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549510 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549513 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549515 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549518 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549521 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549523 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549526 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.550839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549528 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549531 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549533 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549536 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549538 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549541 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549938 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549944 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549947 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549950 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549953 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549956 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549958 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549961 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549963 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549966 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549969 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549971 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549974 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.551289 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549993 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.549997 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550000 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550004 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550007 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550010 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550014 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550018 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550021 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550024 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550027 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550030 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550033 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550038 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550042 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550045 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550048 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550051 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550054 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.551765 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550057 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550060 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550062 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550065 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550068 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550070 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550073 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550075 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550078 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550080 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550083 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550086 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550088 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550091 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550093 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550096 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550098 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550102 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550105 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.552228 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550108 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550111 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550114 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550117 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550119 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550122 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550124 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550127 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550129 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550132 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550134 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550136 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550139 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550141 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550144 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550146 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550149 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550152 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550154 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550156 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550159 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.552713 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550161 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550164 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550166 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550169 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550171 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550173 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550176 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550179 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550181 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550185 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550188 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550191 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550193 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.550196 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551515 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551525 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551531 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551536 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551541 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
--authentication-token-webhook="false" Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551545 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551549 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 04:24:04.553242 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551553 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551557 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551560 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551563 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551567 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551570 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551573 2578 flags.go:64] FLAG: --cgroup-root="" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551576 2578 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551579 2578 flags.go:64] FLAG: --client-ca-file="" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551582 2578 flags.go:64] FLAG: --cloud-config="" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551585 2578 flags.go:64] FLAG: --cloud-provider="external" Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551588 2578 flags.go:64] FLAG: --cluster-dns="[]" 
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551593 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551596 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551599 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551602 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551605 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551610 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551613 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551616 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551620 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551623 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551627 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551630 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551634 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 04:24:04.553771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551636 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551641 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551644 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551647 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551650 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551653 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551656 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551660 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551664 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551667 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551670 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551673 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551676 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551679 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551682 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551686 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551688 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551691 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551694 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551697 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551700 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551703 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551706 2578 flags.go:64] FLAG: --feature-gates=""
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551710 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551713 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 04:24:04.554393 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551716 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551719 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551723 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551726 2578 flags.go:64] FLAG: --help="false"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551729 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551733 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551736 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551739 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551742 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551745 2578 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551748 2578 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551751 2578 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551754 2578 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551757 2578 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551760 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551763 2578 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551765 2578 flags.go:64] FLAG: --kube-reserved=""
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551769 2578 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551772 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551776 2578 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551779 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551782 2578 flags.go:64] FLAG: --lock-file=""
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551784 2578 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551787 2578 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 04:24:04.555023 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551790 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551801 2578 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551804 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551807 2578 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551810 2578 flags.go:64] FLAG: --logging-format="text"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551813 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551816 2578 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551819 2578 flags.go:64] FLAG: --manifest-url=""
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551822 2578 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551826 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551830 2578 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551834 2578 flags.go:64] FLAG: --max-pods="110"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551837 2578 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551840 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551843 2578 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551846 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551849 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551852 2578 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551855 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551862 2578 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551865 2578 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551868 2578 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551871 2578 flags.go:64] FLAG: --pod-cidr=""
Apr 16 04:24:04.555598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551874 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551880 2578 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551883 2578 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551887 2578 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551889 2578 flags.go:64] FLAG: --port="10250"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551892 2578 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551895 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07a40eb8838e91621"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551898 2578 flags.go:64] FLAG: --qos-reserved=""
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551901 2578 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551904 2578 flags.go:64] FLAG: --register-node="true"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551907 2578 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551909 2578 flags.go:64] FLAG: --register-with-taints=""
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551913 2578 flags.go:64] FLAG: --registry-burst="10"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551916 2578 flags.go:64] FLAG: --registry-qps="5"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551919 2578 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551921 2578 flags.go:64] FLAG: --reserved-memory=""
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551925 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551928 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551931 2578 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551934 2578 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551937 2578 flags.go:64] FLAG: --runonce="false"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551940 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551943 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551947 2578 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551950 2578 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551953 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 04:24:04.556269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551956 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551959 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551962 2578 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551965 2578 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551968 2578 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551971 2578 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551974 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551977 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551980 2578 flags.go:64] FLAG: --system-cgroups=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551983 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551989 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551992 2578 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551994 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.551999 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552002 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552004 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552007 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552010 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552014 2578 flags.go:64] FLAG: --v="2"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552018 2578 flags.go:64] FLAG: --version="false"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552022 2578 flags.go:64] FLAG: --vmodule=""
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552026 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552029 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552142 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552145 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.556921 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552149 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552152 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552155 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552158 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552161 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552164 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552173 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552175 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552178 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552181 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552184 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552186 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552189 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552192 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552194 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552197 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552200 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552203 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552205 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.557533 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552208 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552210 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552213 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552215 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552218 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552220 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552223 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552225 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552228 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552230 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552233 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552236 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552239 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552242 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552244 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552247 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552250 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552252 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552256 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552260 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.558022 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552269 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552272 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552275 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552277 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552280 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552282 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552285 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552287 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552290 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552306 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552309 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552315 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552318 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552321 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552323 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552326 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552328 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552331 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552334 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552336 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.558527 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552339 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552341 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552344 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552346 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552349 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552353 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552360 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552363 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552366 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552369 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552371 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552374 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552377 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552380 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552383 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552386 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552388 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552391 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552393 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.559026 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552396 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552399 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552403 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552406 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552409 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.552411 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.552416 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.559142 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.559156 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559210 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559215 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559219 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559222 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559225 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559228 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559231 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.559512 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559234 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559237 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559240 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559244 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559247 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559249 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559252 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559255 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559257 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559260 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559262 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559266 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559269 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559272 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559275 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559277 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559280 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559283 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559286 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559289 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.559937 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559291 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559311 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559314 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559317 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559321 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559324 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559328 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559333 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559336 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559339 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559343 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559346 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559349 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559352 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559354 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559357 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559360 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559362 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 04:24:04.560448 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559365 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 04:24:04.560448 ip-10-0-140-211 
kubenswrapper[2578]: W0416 04:24:04.559367 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559370 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559372 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559375 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559378 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559381 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559383 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559386 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559389 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559391 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559394 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559396 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559399 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 04:24:04.560911 
ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559401 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559404 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559407 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559409 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559412 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559415 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559418 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 04:24:04.560911 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559420 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559423 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559426 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559428 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559431 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559434 2578 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstall Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559436 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559439 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559441 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559444 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559446 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559449 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559451 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559453 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559456 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559459 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559462 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559464 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 04:24:04.561395 ip-10-0-140-211 
kubenswrapper[2578]: W0416 04:24:04.559467 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 04:24:04.561395 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559469 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.559475 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559573 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559578 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559581 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559584 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559587 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559590 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559592 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 04:24:04.561861 ip-10-0-140-211 
kubenswrapper[2578]: W0416 04:24:04.559595 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559598 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559600 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559604 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559606 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559608 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 04:24:04.561861 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559611 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559613 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559616 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559618 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559621 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559623 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559627 2578 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559631 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559635 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559639 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559642 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559645 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559648 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559651 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559653 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559656 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559659 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559662 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559664 2578 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 04:24:04.562214 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559667 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559670 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559672 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559675 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559677 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559680 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559682 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559685 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559687 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559690 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559692 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559695 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559697 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559700 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559702 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559705 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559707 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559710 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559712 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559715 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 04:24:04.562689 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559717 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559720 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559722 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559725 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559727 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas 
Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559730 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559732 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559735 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559738 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559740 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559743 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559746 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559748 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559751 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559753 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559755 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559758 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559760 2578 feature_gate.go:328] unrecognized 
feature gate: AdminNetworkPolicy Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559763 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559765 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 04:24:04.563184 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559768 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559770 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559772 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559775 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559779 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559781 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559784 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559786 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559789 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559791 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559794 
2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559796 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559799 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:04.559801 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.559807 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 04:24:04.563692 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.560699 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 04:24:04.564060 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.563934 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 04:24:04.564978 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.564966 2578 server.go:1019] "Starting client certificate rotation" Apr 16 04:24:04.565073 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.565058 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 04:24:04.565105 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.565091 2578 certificate_manager.go:566] 
"Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 04:24:04.591673 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.591652 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 04:24:04.597214 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.597182 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 04:24:04.612350 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.612323 2578 log.go:25] "Validated CRI v1 runtime API" Apr 16 04:24:04.617989 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.617970 2578 log.go:25] "Validated CRI v1 image API" Apr 16 04:24:04.620501 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.620472 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 04:24:04.623469 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.623449 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 04:24:04.624792 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.624772 2578 fs.go:135] Filesystem UUIDs: map[2dbcb6a9-9deb-498f-bcee-023e9ad4483f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 bb4736d9-5c36-4ec7-a1ba-daeaacc3a60b:/dev/nvme0n1p3] Apr 16 04:24:04.624849 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.624791 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay 
blockSize:0}]
Apr 16 04:24:04.630717 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.630598 2578 manager.go:217] Machine: {Timestamp:2026-04-16 04:24:04.628600168 +0000 UTC m=+0.445140238 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095897 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bf09487f280194c4c34560a727ea1 SystemUUID:ec2bf094-87f2-8019-4c4c-34560a727ea1 BootID:84e2df59-4575-4bc7-9206-fe193a6cae3e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fc:d8:9b:a6:dd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fc:d8:9b:a6:dd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:bd:a4:09:25:bb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 04:24:04.630717 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.630712 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 04:24:04.630830 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.630804 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 04:24:04.631784 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.631765 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 04:24:04.631940 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.631787 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-211.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 04:24:04.631981 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.631950 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 04:24:04.631981 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.631960 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 04:24:04.631981 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.631973 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 04:24:04.633944 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.633933 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 04:24:04.635898 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.635887 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 04:24:04.636005 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.635996 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 04:24:04.637863 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.637848 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m2zml"
Apr 16 04:24:04.638708 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.638698 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 04:24:04.638746 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.638714 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 04:24:04.638746 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.638729 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 04:24:04.638746 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.638739 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 16 04:24:04.638843 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.638753 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 04:24:04.639935 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.639922 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 04:24:04.639995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.639945 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 04:24:04.643380 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.643365 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 04:24:04.644587 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.644571 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m2zml"
Apr 16 04:24:04.645655 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.645637 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 04:24:04.647491 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647468 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 04:24:04.647491 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647493 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647500 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647511 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647517 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647523 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647530 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647535 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647543 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647551 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647560 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 04:24:04.647629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.647569 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 04:24:04.648595 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.648580 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 04:24:04.648649 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.648607 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 04:24:04.652121 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652105 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:04.652460 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652442 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 04:24:04.652520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652505 2578 server.go:1295] "Started kubelet"
Apr 16 04:24:04.652597 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652572 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 04:24:04.652952 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652912 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 04:24:04.653027 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.652970 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 04:24:04.653366 ip-10-0-140-211 systemd[1]: Started Kubernetes Kubelet.
Apr 16 04:24:04.654196 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.654120 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 04:24:04.654736 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.654721 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:04.655141 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.655129 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 04:24:04.658983 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.658967 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 04:24:04.659080 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.658989 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 04:24:04.659625 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659608 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 04:24:04.659625 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659611 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 04:24:04.659775 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659640 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 04:24:04.659775 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659677 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-211.ec2.internal" not found
Apr 16 04:24:04.659775 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659761 2578 factory.go:55] Registering systemd factory
Apr 16 04:24:04.659914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659785 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 04:24:04.659914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659795 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 04:24:04.659914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659804 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 16 04:24:04.660050 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.659979 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-211.ec2.internal\" not found"
Apr 16 04:24:04.660050 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.659997 2578 factory.go:153] Registering CRI-O factory
Apr 16 04:24:04.660050 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.660008 2578 factory.go:223] Registration of the crio container factory successfully
Apr 16 04:24:04.660201 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.660058 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 04:24:04.660201 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.660083 2578 factory.go:103] Registering Raw factory
Apr 16 04:24:04.660201 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.660096 2578 manager.go:1196] Started watching for new ooms in manager
Apr 16 04:24:04.660531 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.660516 2578 manager.go:319] Starting recovery of all containers
Apr 16 04:24:04.661442 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.661418 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:04.662647 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.662625 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 04:24:04.668449 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.668343 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-211.ec2.internal\" not found" node="ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.668789 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.668748 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 04:24:04.671532 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.671507 2578 manager.go:324] Recovery completed
Apr 16 04:24:04.677154 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.677045 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:04.678735 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.678719 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:04.678808 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.678747 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:04.678808 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.678757 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:04.679217 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.679203 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 04:24:04.679217 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.679216 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 04:24:04.679330 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.679231 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 04:24:04.681870 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.681858 2578 policy_none.go:49] "None policy: Start"
Apr 16 04:24:04.681923 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.681875 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 04:24:04.681923 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.681885 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 04:24:04.685478 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.685462 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-211.ec2.internal" not found
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.725965 2578 manager.go:341] "Starting Device Plugin manager"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.726005 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726016 2578 server.go:85] "Starting device plugin registration server"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726274 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726286 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726415 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726488 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.726497 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.727020 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 04:24:04.727861 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.727055 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-211.ec2.internal\" not found"
Apr 16 04:24:04.748678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.748655 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-211.ec2.internal" not found
Apr 16 04:24:04.786419 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.786396 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 04:24:04.786532 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.786433 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 04:24:04.786532 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.786453 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 04:24:04.786532 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.786460 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 04:24:04.786532 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:04.786490 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 04:24:04.788234 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.788218 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:04.827262 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.827189 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:04.828230 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.828214 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:04.828334 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.828245 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:04.828334 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.828263 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:04.828334 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.828287 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.836278 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.836262 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.887092 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.887053 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"]
Apr 16 04:24:04.891984 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.891969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.892071 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.891969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.909813 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.909797 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.914075 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.914062 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.928334 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.928315 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 04:24:04.961716 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.961693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a60e730c1a02bac81e499c95b0d4aa1-config\") pod \"kube-apiserver-proxy-ip-10-0-140-211.ec2.internal\" (UID: \"1a60e730c1a02bac81e499c95b0d4aa1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.961804 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.961722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.961804 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.961746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:04.964870 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:04.964853 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 04:24:05.062309 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a60e730c1a02bac81e499c95b0d4aa1-config\") pod \"kube-apiserver-proxy-ip-10-0-140-211.ec2.internal\" (UID: \"1a60e730c1a02bac81e499c95b0d4aa1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.062309 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.062506 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.062506 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.062506 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5592b42208bf0b26cc8688ad52a01ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal\" (UID: \"5592b42208bf0b26cc8688ad52a01ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.062506 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.062427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a60e730c1a02bac81e499c95b0d4aa1-config\") pod \"kube-apiserver-proxy-ip-10-0-140-211.ec2.internal\" (UID: \"1a60e730c1a02bac81e499c95b0d4aa1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.232611 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.232584 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.268196 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.268172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal"
Apr 16 04:24:05.564518 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.564421 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 04:24:05.565253 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.564574 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:05.565253 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.564578 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:05.565253 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.564616 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:05.639207 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.639176 2578 apiserver.go:52] "Watching apiserver"
Apr 16 04:24:05.647168 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.647128 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 04:19:04 +0000 UTC" deadline="2027-11-01 10:52:38.839974437 +0000 UTC"
Apr 16 04:24:05.647168 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.647166 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13542h28m33.192812544s"
Apr 16 04:24:05.647448 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.647286 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 04:24:05.647663 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.647643 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-s72p4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal","openshift-multus/multus-additional-cni-plugins-btwn7","openshift-multus/multus-rk4kv","openshift-multus/network-metrics-daemon-qhcj5","kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl","openshift-image-registry/node-ca-ctskn","openshift-network-diagnostics/network-check-target-mnzb4","openshift-network-operator/iptables-alerter-w4n2g","openshift-ovn-kubernetes/ovnkube-node-7vmzk","kube-system/konnectivity-agent-97kqm","openshift-cluster-node-tuning-operator/tuned-ws575"]
Apr 16 04:24:05.651986 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.651964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w4n2g"
Apr 16 04:24:05.654004 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.653984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rk4kv"
Apr 16 04:24:05.654098 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.654083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:05.654598 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.654460 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:05.655496 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.655475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:24:05.655681 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.655664 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 04:24:05.656644 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.655950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 04:24:05.656644 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.656102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n882s\""
Apr 16 04:24:05.656866 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.656833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ctskn"
Apr 16 04:24:05.657273 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.657126 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 04:24:05.657273 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.657228 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 04:24:05.657600 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.657579 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 04:24:05.657680 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.657593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 04:24:05.657751 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.657692 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mt6hr\""
Apr 16 04:24:05.659115 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659100 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 04:24:05.659311 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659266 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 04:24:05.659408 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659395 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kjlp\""
Apr 16 04:24:05.659471 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659418 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s72p4"
Apr 16 04:24:05.659515 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659476 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 04:24:05.659687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.659670 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 04:24:05.661755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.661594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 04:24:05.661755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.661631 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rg4l9\""
Apr 16 04:24:05.661755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.661720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-btwn7"
Apr 16 04:24:05.661939 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.661796 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 04:24:05.663873 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.663845 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl"
Apr 16 04:24:05.664541 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6630844f-3950-46f3-b23d-355ceec908ec-hosts-file\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4"
Apr 16 04:24:05.664634 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-os-release\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7"
Apr 16 04:24:05.664634 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-multus\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv"
Apr 16 04:24:05.664634 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-cnibin\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7"
Apr 16 04:24:05.664779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664639 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7"
Apr 16 04:24:05.664779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-os-release\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv"
Apr 16 04:24:05.664779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-cni-binary-copy\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv"
Apr 16 04:24:05.664779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e9738669-31c5-4a1c-8e9b-4f0691464165-serviceca\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn"
Apr 16 04:24:05.664779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bfcb\" (UniqueName: \"kubernetes.io/projected/6630844f-3950-46f3-b23d-355ceec908ec-kube-api-access-2bfcb\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4"
Apr 16 04:24:05.664975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664784 2578 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf878\" (UniqueName: \"kubernetes.io/projected/97c67a71-c877-4bde-ae72-3f9915f6cb95-kube-api-access-tf878\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.664975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664815 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.664975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-netns\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.664975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-hostroot\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.664975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-daemon-config\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.664975 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:24:05.664959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-etc-kubernetes\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.664986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-system-cni-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vs6\" (UniqueName: \"kubernetes.io/projected/0763391d-17aa-4fb0-a753-c705589537ab-kube-api-access-k4vs6\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/97c67a71-c877-4bde-ae72-3f9915f6cb95-host-slash\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-socket-dir-parent\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665150 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665192 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9738669-31c5-4a1c-8e9b-4f0691464165-host\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqmz\" (UniqueName: \"kubernetes.io/projected/e9738669-31c5-4a1c-8e9b-4f0691464165-kube-api-access-rxqmz\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.665332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665265 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9zf\" (UniqueName: \"kubernetes.io/projected/6effaf82-47fd-4e1e-904a-407184af8a9d-kube-api-access-dk9zf\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pwrz\" (UniqueName: \"kubernetes.io/projected/b30a87b4-65a2-4504-be52-b10fb247dedb-kube-api-access-4pwrz\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97c67a71-c877-4bde-ae72-3f9915f6cb95-iptables-alerter-script\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-bin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6630844f-3950-46f3-b23d-355ceec908ec-tmp-dir\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " 
pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665459 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665503 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bhlg2\"" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665528 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-cnibin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " 
pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-system-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-k8s-cni-cncf-io\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-kubelet\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-conf-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.665758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.665697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-multus-certs\") pod 
\"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.666429 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.666168 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:05.666429 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.666231 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:05.666429 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.666350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2l45v\"" Apr 16 04:24:05.667133 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.666608 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 04:24:05.667133 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.666893 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 04:24:05.667133 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.666894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 04:24:05.671142 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.671123 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.671420 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.671402 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.673348 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.673331 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 04:24:05.673425 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.673394 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.676218 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.676203 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 04:24:05.676465 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.676441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 04:24:05.676536 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.676496 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 04:24:05.677736 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.677692 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 04:24:05.677736 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.677709 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 04:24:05.677892 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.677746 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t579d\"" Apr 16 04:24:05.677892 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.677751 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:24:05.677892 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.677764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qxqfm\"" Apr 16 04:24:05.678249 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.678227 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 04:24:05.678764 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.678747 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 04:24:05.678867 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.678797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9r2xw\"" Apr 16 04:24:05.678867 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.678805 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 04:24:05.678989 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.678871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 04:24:05.708496 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.708474 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xbjn6" Apr 16 04:24:05.717715 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.717697 2578 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-xbjn6" Apr 16 04:24:05.760782 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.760762 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 04:24:05.766483 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:05.766564 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-slash\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.766564 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.766564 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-host\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.766564 ip-10-0-140-211 kubenswrapper[2578]: 
I0416 04:24:05.766543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9zf\" (UniqueName: \"kubernetes.io/projected/6effaf82-47fd-4e1e-904a-407184af8a9d-kube-api-access-dk9zf\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.766682 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-netd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.766682 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.766637 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:05.766736 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovn-node-metrics-cert\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.766769 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.766758 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:06.266734438 +0000 UTC m=+2.083274515 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:05.766843 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-bin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.766843 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-systemd-units\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.766916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.766916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 
04:24:05.767021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-bin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-cnibin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766966 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-kubelet\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.766991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpfd\" (UniqueName: \"kubernetes.io/projected/9a43d074-9490-4217-acb3-18783e647e75-kube-api-access-mfpfd\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.767021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-cnibin\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767021 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:24:05.767016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-system-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-k8s-cni-cncf-io\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-system-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-kubelet\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-k8s-cni-cncf-io\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767106 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-conf-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-kubelet\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767131 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-systemd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-conf-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-ovn\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767202 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-log-socket\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-tmp\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6630844f-3950-46f3-b23d-355ceec908ec-hosts-file\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.767259 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-os-release\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-os-release\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 
04:24:05.767378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-multus\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6630844f-3950-46f3-b23d-355ceec908ec-hosts-file\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-lib-modules\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-var-lib-cni-multus\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-var-lib-kubelet\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.767806 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:24:05.767471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-cnibin\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-cnibin\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-os-release\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-cni-binary-copy\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 
04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e9738669-31c5-4a1c-8e9b-4f0691464165-serviceca\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-sys-fs\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767615 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-os-release\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-script-lib\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.767806 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-run\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf878\" (UniqueName: \"kubernetes.io/projected/97c67a71-c877-4bde-ae72-3f9915f6cb95-kube-api-access-tf878\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-netns\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-hostroot\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-daemon-config\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " 
pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-hostroot\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4gr\" (UniqueName: \"kubernetes.io/projected/25372649-86ad-45ac-ae3f-a871b8e3caa0-kube-api-access-zf4gr\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzmqf\" (UniqueName: \"kubernetes.io/projected/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-kube-api-access-mzmqf\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-netns\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.767974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-systemd\") pod \"tuned-ws575\" (UID: 
\"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e9738669-31c5-4a1c-8e9b-4f0691464165-serviceca\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-system-cni-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vs6\" (UniqueName: \"kubernetes.io/projected/0763391d-17aa-4fb0-a753-c705589537ab-kube-api-access-k4vs6\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768106 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-system-cni-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97c67a71-c877-4bde-ae72-3f9915f6cb95-host-slash\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-cni-binary-copy\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.768647 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97c67a71-c877-4bde-ae72-3f9915f6cb95-host-slash\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-socket-dir-parent\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/e9738669-31c5-4a1c-8e9b-4f0691464165-host\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqmz\" (UniqueName: \"kubernetes.io/projected/e9738669-31c5-4a1c-8e9b-4f0691464165-kube-api-access-rxqmz\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-netns\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9738669-31c5-4a1c-8e9b-4f0691464165-host\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pwrz\" (UniqueName: \"kubernetes.io/projected/b30a87b4-65a2-4504-be52-b10fb247dedb-kube-api-access-4pwrz\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-socket-dir-parent\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-device-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-daemon-config\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768384 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-var-lib-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 
04:24:05.768454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-node-log\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-config\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd90ea8e-03cf-460f-8525-c28183fc3a33-agent-certs\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-kubernetes\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97c67a71-c877-4bde-ae72-3f9915f6cb95-iptables-alerter-script\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " 
pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.769324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-registration-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6630844f-3950-46f3-b23d-355ceec908ec-tmp-dir\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: 
\"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-env-overrides\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-conf\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-etc-tuned\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-multus-certs\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768731 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-etc-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysconfig\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-sys\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " 
pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-socket-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.769945 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd90ea8e-03cf-460f-8525-c28183fc3a33-konnectivity-ca\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-modprobe-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6630844f-3950-46f3-b23d-355ceec908ec-tmp-dir\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bfcb\" (UniqueName: \"kubernetes.io/projected/6630844f-3950-46f3-b23d-355ceec908ec-kube-api-access-2bfcb\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-etc-kubernetes\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.768971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-bin\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-host-run-multus-certs\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-multus-cni-dir\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97c67a71-c877-4bde-ae72-3f9915f6cb95-iptables-alerter-script\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6effaf82-47fd-4e1e-904a-407184af8a9d-etc-kubernetes\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0763391d-17aa-4fb0-a753-c705589537ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.770514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.769755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0763391d-17aa-4fb0-a753-c705589537ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.772759 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.772739 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 04:24:05.776361 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.776344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9zf\" (UniqueName: \"kubernetes.io/projected/6effaf82-47fd-4e1e-904a-407184af8a9d-kube-api-access-dk9zf\") pod \"multus-rk4kv\" (UID: \"6effaf82-47fd-4e1e-904a-407184af8a9d\") " pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.776465 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.776373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pwrz\" (UniqueName: \"kubernetes.io/projected/b30a87b4-65a2-4504-be52-b10fb247dedb-kube-api-access-4pwrz\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:05.776465 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.776385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqmz\" (UniqueName: \"kubernetes.io/projected/e9738669-31c5-4a1c-8e9b-4f0691464165-kube-api-access-rxqmz\") pod \"node-ca-ctskn\" (UID: \"e9738669-31c5-4a1c-8e9b-4f0691464165\") " pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:05.776581 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.776561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vs6\" (UniqueName: 
\"kubernetes.io/projected/0763391d-17aa-4fb0-a753-c705589537ab-kube-api-access-k4vs6\") pod \"multus-additional-cni-plugins-btwn7\" (UID: \"0763391d-17aa-4fb0-a753-c705589537ab\") " pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:05.776636 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.776597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf878\" (UniqueName: \"kubernetes.io/projected/97c67a71-c877-4bde-ae72-3f9915f6cb95-kube-api-access-tf878\") pod \"iptables-alerter-w4n2g\" (UID: \"97c67a71-c877-4bde-ae72-3f9915f6cb95\") " pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.777433 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.777414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bfcb\" (UniqueName: \"kubernetes.io/projected/6630844f-3950-46f3-b23d-355ceec908ec-kube-api-access-2bfcb\") pod \"node-resolver-s72p4\" (UID: \"6630844f-3950-46f3-b23d-355ceec908ec\") " pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:05.780469 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:05.780445 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5592b42208bf0b26cc8688ad52a01ff0.slice/crio-37473449ba05d0f40528f1713ad1d2d1ba606cefc681cd6d7722c48aae3b1bfc WatchSource:0}: Error finding container 37473449ba05d0f40528f1713ad1d2d1ba606cefc681cd6d7722c48aae3b1bfc: Status 404 returned error can't find the container with id 37473449ba05d0f40528f1713ad1d2d1ba606cefc681cd6d7722c48aae3b1bfc Apr 16 04:24:05.781287 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:05.781269 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a60e730c1a02bac81e499c95b0d4aa1.slice/crio-ebab2c18ccf9ef69237f77eb89b6e6ee1a18455dc1c95488c81a2bcf8cf4aef9 WatchSource:0}: Error finding container 
ebab2c18ccf9ef69237f77eb89b6e6ee1a18455dc1c95488c81a2bcf8cf4aef9: Status 404 returned error can't find the container with id ebab2c18ccf9ef69237f77eb89b6e6ee1a18455dc1c95488c81a2bcf8cf4aef9 Apr 16 04:24:05.785001 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.784984 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:24:05.790698 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.790606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal" event={"ID":"1a60e730c1a02bac81e499c95b0d4aa1","Type":"ContainerStarted","Data":"ebab2c18ccf9ef69237f77eb89b6e6ee1a18455dc1c95488c81a2bcf8cf4aef9"} Apr 16 04:24:05.793409 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.793389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal" event={"ID":"5592b42208bf0b26cc8688ad52a01ff0","Type":"ContainerStarted","Data":"37473449ba05d0f40528f1713ad1d2d1ba606cefc681cd6d7722c48aae3b1bfc"} Apr 16 04:24:05.869927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.869847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-socket-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.869927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.869879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd90ea8e-03cf-460f-8525-c28183fc3a33-konnectivity-ca\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.869927 ip-10-0-140-211 kubenswrapper[2578]: I0416 
04:24:05.869895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-modprobe-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.869927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.869919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-bin\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.869964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-bin\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-slash\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-modprobe-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870033 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-socket-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870067 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-slash\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-host\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-host\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-netd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovn-node-metrics-cert\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-systemd-units\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-systemd-units\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-kubelet\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870222 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpfd\" (UniqueName: \"kubernetes.io/projected/9a43d074-9490-4217-acb3-18783e647e75-kube-api-access-mfpfd\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-systemd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-d\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-ovn\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-log-socket\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-cni-netd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-systemd\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-log-socket\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-kubelet\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-ovn\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870379 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-tmp\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-lib-modules\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-var-lib-kubelet\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-sys-fs\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870489 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd90ea8e-03cf-460f-8525-c28183fc3a33-konnectivity-ca\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-var-lib-kubelet\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-lib-modules\") pod \"tuned-ws575\" (UID: 
\"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.870931 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-sys-fs\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-script-lib\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-run\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4gr\" (UniqueName: 
\"kubernetes.io/projected/25372649-86ad-45ac-ae3f-a871b8e3caa0-kube-api-access-zf4gr\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzmqf\" (UniqueName: \"kubernetes.io/projected/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-kube-api-access-mzmqf\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-run\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-systemd\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-netns\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-device-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-var-lib-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-systemd\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-device-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870890 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-node-log\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-host-run-netns\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-config\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.871738 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd90ea8e-03cf-460f-8525-c28183fc3a33-agent-certs\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-var-lib-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870966 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-kubernetes\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-registration-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-run-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-env-overrides\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:24:05.871052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-conf\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-etc-tuned\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.870923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-node-log\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-etc-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 
ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-script-lib\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-kubernetes\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-registration-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-etc-selinux\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovnkube-config\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872267 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysctl-conf\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-etc-openvswitch\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-env-overrides\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysconfig\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-sys\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-etc-sysconfig\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a43d074-9490-4217-acb3-18783e647e75-sys\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.872752 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.871755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25372649-86ad-45ac-ae3f-a871b8e3caa0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.873014 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.872994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-tmp\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.873336 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.873318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-ovn-node-metrics-cert\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.873412 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.873346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a43d074-9490-4217-acb3-18783e647e75-etc-tuned\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.873412 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.873325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd90ea8e-03cf-460f-8525-c28183fc3a33-agent-certs\") pod \"konnectivity-agent-97kqm\" (UID: \"dd90ea8e-03cf-460f-8525-c28183fc3a33\") " pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:05.876044 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.876024 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:05.876116 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.876048 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:05.876116 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.876058 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:05.876222 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:05.876118 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:06.376100643 +0000 UTC m=+2.192640711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:05.877759 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.877743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzmqf\" (UniqueName: \"kubernetes.io/projected/b57f188f-8b5d-4bf1-9ab4-39e808fa255e-kube-api-access-mzmqf\") pod \"ovnkube-node-7vmzk\" (UID: \"b57f188f-8b5d-4bf1-9ab4-39e808fa255e\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:05.878062 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.878044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpfd\" (UniqueName: \"kubernetes.io/projected/9a43d074-9490-4217-acb3-18783e647e75-kube-api-access-mfpfd\") pod \"tuned-ws575\" (UID: \"9a43d074-9490-4217-acb3-18783e647e75\") " pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:05.878121 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.878068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4gr\" (UniqueName: 
\"kubernetes.io/projected/25372649-86ad-45ac-ae3f-a871b8e3caa0-kube-api-access-zf4gr\") pod \"aws-ebs-csi-driver-node-hg5kl\" (UID: \"25372649-86ad-45ac-ae3f-a871b8e3caa0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:05.982673 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.982645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w4n2g" Apr 16 04:24:05.988710 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:05.988687 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c67a71_c877_4bde_ae72_3f9915f6cb95.slice/crio-d1a4e383c886e3b713fe1a4305c8ef4b85aaa69f5f021fb349941a28aa513729 WatchSource:0}: Error finding container d1a4e383c886e3b713fe1a4305c8ef4b85aaa69f5f021fb349941a28aa513729: Status 404 returned error can't find the container with id d1a4e383c886e3b713fe1a4305c8ef4b85aaa69f5f021fb349941a28aa513729 Apr 16 04:24:05.994629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:05.994608 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rk4kv" Apr 16 04:24:05.999917 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:05.999895 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6effaf82_47fd_4e1e_904a_407184af8a9d.slice/crio-679e33f1d36ae881b57d73566c3968c9f1be18d11980787676fdb6b2ac4e4ed3 WatchSource:0}: Error finding container 679e33f1d36ae881b57d73566c3968c9f1be18d11980787676fdb6b2ac4e4ed3: Status 404 returned error can't find the container with id 679e33f1d36ae881b57d73566c3968c9f1be18d11980787676fdb6b2ac4e4ed3 Apr 16 04:24:06.008972 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.008952 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ctskn" Apr 16 04:24:06.013433 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.013416 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s72p4" Apr 16 04:24:06.015272 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.015253 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9738669_31c5_4a1c_8e9b_4f0691464165.slice/crio-a92221a2eb57fd3cadd79dae16848ec8b99a8083f0201c5b357346dd419707d6 WatchSource:0}: Error finding container a92221a2eb57fd3cadd79dae16848ec8b99a8083f0201c5b357346dd419707d6: Status 404 returned error can't find the container with id a92221a2eb57fd3cadd79dae16848ec8b99a8083f0201c5b357346dd419707d6 Apr 16 04:24:06.020839 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.020820 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6630844f_3950_46f3_b23d_355ceec908ec.slice/crio-913ff83c078cbf6ca91a01f2153cbc4abfe7ae9c5eeb58596b464c29159ccece WatchSource:0}: Error finding container 913ff83c078cbf6ca91a01f2153cbc4abfe7ae9c5eeb58596b464c29159ccece: Status 404 returned error can't find the container with id 913ff83c078cbf6ca91a01f2153cbc4abfe7ae9c5eeb58596b464c29159ccece Apr 16 04:24:06.024325 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.024309 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-btwn7" Apr 16 04:24:06.030259 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.030238 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0763391d_17aa_4fb0_a753_c705589537ab.slice/crio-b5ec54aff0fc80d7b88d2262bc3163f394becc64c340d2bbdf761971af9b50db WatchSource:0}: Error finding container b5ec54aff0fc80d7b88d2262bc3163f394becc64c340d2bbdf761971af9b50db: Status 404 returned error can't find the container with id b5ec54aff0fc80d7b88d2262bc3163f394becc64c340d2bbdf761971af9b50db Apr 16 04:24:06.040603 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.040587 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" Apr 16 04:24:06.046407 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.046386 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25372649_86ad_45ac_ae3f_a871b8e3caa0.slice/crio-8281064294d38613ef76d4dbc931af355c5740a850e06c2f8d26426d6a97dcb3 WatchSource:0}: Error finding container 8281064294d38613ef76d4dbc931af355c5740a850e06c2f8d26426d6a97dcb3: Status 404 returned error can't find the container with id 8281064294d38613ef76d4dbc931af355c5740a850e06c2f8d26426d6a97dcb3 Apr 16 04:24:06.058010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.057993 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:06.064430 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.064412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb57f188f_8b5d_4bf1_9ab4_39e808fa255e.slice/crio-5bbdb322fa2e42ae3682b6203d4701a1148ed48b7316f7e496132861bbbf4f3a WatchSource:0}: Error finding container 5bbdb322fa2e42ae3682b6203d4701a1148ed48b7316f7e496132861bbbf4f3a: Status 404 returned error can't find the container with id 5bbdb322fa2e42ae3682b6203d4701a1148ed48b7316f7e496132861bbbf4f3a Apr 16 04:24:06.077636 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.077618 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:06.083196 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.083176 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd90ea8e_03cf_460f_8525_c28183fc3a33.slice/crio-c45b935d2f9d788d0d8e4bd3007002765bd82e9aa6d2de91c9d48fffdc9c190e WatchSource:0}: Error finding container c45b935d2f9d788d0d8e4bd3007002765bd82e9aa6d2de91c9d48fffdc9c190e: Status 404 returned error can't find the container with id c45b935d2f9d788d0d8e4bd3007002765bd82e9aa6d2de91c9d48fffdc9c190e Apr 16 04:24:06.093076 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.093060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ws575" Apr 16 04:24:06.098078 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:06.098057 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a43d074_9490_4217_acb3_18783e647e75.slice/crio-be019b53b74e103b93123799601fed3017e06ec9148a4d6de891bf2a6431c6ed WatchSource:0}: Error finding container be019b53b74e103b93123799601fed3017e06ec9148a4d6de891bf2a6431c6ed: Status 404 returned error can't find the container with id be019b53b74e103b93123799601fed3017e06ec9148a4d6de891bf2a6431c6ed Apr 16 04:24:06.276226 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.276189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:06.277322 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.277121 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:06.277322 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.277217 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:07.277196092 +0000 UTC m=+3.093736152 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:06.377502 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.377259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:06.377502 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.377441 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:06.377502 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.377459 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:06.377502 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.377472 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:06.377821 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.377570 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:07.377550769 +0000 UTC m=+3.194090827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:06.719283 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.719162 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:05 +0000 UTC" deadline="2028-01-07 04:08:22.967326036 +0000 UTC" Apr 16 04:24:06.719283 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.719201 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15143h44m16.248129015s" Apr 16 04:24:06.788492 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.788461 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:06.788654 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:06.788589 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:06.815049 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.815010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" event={"ID":"25372649-86ad-45ac-ae3f-a871b8e3caa0","Type":"ContainerStarted","Data":"8281064294d38613ef76d4dbc931af355c5740a850e06c2f8d26426d6a97dcb3"} Apr 16 04:24:06.823376 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.821977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerStarted","Data":"b5ec54aff0fc80d7b88d2262bc3163f394becc64c340d2bbdf761971af9b50db"} Apr 16 04:24:06.825078 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.825022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ctskn" event={"ID":"e9738669-31c5-4a1c-8e9b-4f0691464165","Type":"ContainerStarted","Data":"a92221a2eb57fd3cadd79dae16848ec8b99a8083f0201c5b357346dd419707d6"} Apr 16 04:24:06.849864 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.849827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rk4kv" event={"ID":"6effaf82-47fd-4e1e-904a-407184af8a9d","Type":"ContainerStarted","Data":"679e33f1d36ae881b57d73566c3968c9f1be18d11980787676fdb6b2ac4e4ed3"} Apr 16 04:24:06.858665 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.858627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w4n2g" event={"ID":"97c67a71-c877-4bde-ae72-3f9915f6cb95","Type":"ContainerStarted","Data":"d1a4e383c886e3b713fe1a4305c8ef4b85aaa69f5f021fb349941a28aa513729"} Apr 16 04:24:06.874471 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.874438 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-ws575" event={"ID":"9a43d074-9490-4217-acb3-18783e647e75","Type":"ContainerStarted","Data":"be019b53b74e103b93123799601fed3017e06ec9148a4d6de891bf2a6431c6ed"} Apr 16 04:24:06.890076 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.890043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s72p4" event={"ID":"6630844f-3950-46f3-b23d-355ceec908ec","Type":"ContainerStarted","Data":"913ff83c078cbf6ca91a01f2153cbc4abfe7ae9c5eeb58596b464c29159ccece"} Apr 16 04:24:06.908117 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.908080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-97kqm" event={"ID":"dd90ea8e-03cf-460f-8525-c28183fc3a33","Type":"ContainerStarted","Data":"c45b935d2f9d788d0d8e4bd3007002765bd82e9aa6d2de91c9d48fffdc9c190e"} Apr 16 04:24:06.928431 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.928400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"5bbdb322fa2e42ae3682b6203d4701a1148ed48b7316f7e496132861bbbf4f3a"} Apr 16 04:24:06.957762 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.957730 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 04:24:06.988776 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:06.988748 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 04:24:07.162013 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.161984 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 04:24:07.287591 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.287509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:07.287751 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.287671 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:07.287751 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.287741 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:09.287722948 +0000 UTC m=+5.104263023 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:07.388464 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.388427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:07.388623 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.388593 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:07.388623 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.388612 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:07.388623 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.388624 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:07.388769 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.388680 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:09.388660707 +0000 UTC m=+5.205200782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:07.720407 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.720310 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:05 +0000 UTC" deadline="2028-01-15 15:39:44.896418882 +0000 UTC" Apr 16 04:24:07.720407 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.720347 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15347h15m37.176075187s" Apr 16 04:24:07.789050 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:07.788550 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:07.789050 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:07.788692 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:08.789466 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:08.789426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:08.789914 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:08.789544 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:09.306378 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:09.305747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:09.306378 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.305969 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:09.306378 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.306032 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:13.306012508 +0000 UTC m=+9.122552567 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:09.407208 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:09.406615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:09.407208 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.406785 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 04:24:09.407208 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.406806 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 04:24:09.407208 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.406818 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:09.407208 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.406883 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:13.406863096 +0000 UTC m=+9.223403162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:09.787402 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:09.787372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:09.787561 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:09.787505 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:10.787488 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:10.787448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:10.787939 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:10.787584 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:11.787723 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:11.787686 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:11.788156 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:11.787831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:12.786995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.786960 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:12.787193 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:12.787097 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:12.866269 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.865550 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-95bvb"]
Apr 16 04:24:12.867949 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.867525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:12.867949 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:12.867608 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:12.938684 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.938539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:12.938684 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.938602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-kubelet-config\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:12.938684 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:12.938630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-dbus\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.039266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.039341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-kubelet-config\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.039371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-dbus\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.039405 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.039473 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:13.53945704 +0000 UTC m=+9.355997112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.039547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-dbus\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.039687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.039604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32c323b6-9b7e-46de-ac37-f304c3267420-kubelet-config\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.342471 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.342382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:13.342662 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.342514 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:13.342662 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.342591 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:21.342567852 +0000 UTC m=+17.159107920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:13.442948 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.442913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:13.443104 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.443060 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 04:24:13.443104 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.443081 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 04:24:13.443104 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.443093 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:13.443270 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.443153 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:21.443135085 +0000 UTC m=+17.259675149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:13.544270 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.543756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:13.544270 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.543904 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:13.544270 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.543959 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:14.543940876 +0000 UTC m=+10.360480951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:13.787582 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:13.787548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:13.787749 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:13.787700 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:14.550275 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:14.550232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:14.550843 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:14.550435 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:14.550843 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:14.550517 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:16.550490907 +0000 UTC m=+12.367030972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:14.787896 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:14.787659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:14.788054 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:14.787710 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:14.788122 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:14.787977 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:14.788122 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:14.788101 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:15.787356 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:15.787319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:15.787742 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:15.787441 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:16.568882 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:16.568843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:16.569083 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:16.568986 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:16.569083 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:16.569042 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:20.569029621 +0000 UTC m=+16.385569679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:16.787176 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:16.787143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:16.787366 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:16.787143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:16.787366 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:16.787277 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:16.787701 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:16.787369 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:17.787245 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:17.787207 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:17.787445 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:17.787344 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:18.786800 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:18.786754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:18.786800 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:18.786795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:18.786977 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:18.786895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:18.787069 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:18.786992 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:19.787038 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:19.786941 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:19.787477 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:19.787074 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:20.600984 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:20.600922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:20.601169 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:20.601068 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:20.601169 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:20.601137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:28.601119677 +0000 UTC m=+24.417659755 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered
Apr 16 04:24:20.786797 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:20.786752 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:20.786797 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:20.786781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:20.787019 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:20.786882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:20.787088 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:20.787015 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:21.407474 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:21.407436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:21.407702 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.407629 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:21.407756 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.407710 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:24:37.407688254 +0000 UTC m=+33.224228314 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:21.508175 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:21.508134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:21.508356 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.508321 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 04:24:21.508356 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.508346 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 04:24:21.508356 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.508358 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:21.508479 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.508424 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:37.50840914 +0000 UTC m=+33.324949199 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:21.787460 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:21.787427 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:21.787841 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:21.787549 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:22.787107 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:22.787077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:22.787315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:22.787124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:22.787315 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:22.787202 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:22.787430 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:22.787338 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:23.786929 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:23.786868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:24:23.787311 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:23.787003 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb"
Apr 16 04:24:24.788163 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.788128 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:24:24.789025 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:24.788232 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097"
Apr 16 04:24:24.789025 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.788381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb"
Apr 16 04:24:24.789025 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:24.788507 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420"
Apr 16 04:24:24.962674 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.962651 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log"
Apr 16 04:24:24.963197 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963072 2578 generic.go:358] "Generic (PLEG): container finished" podID="b57f188f-8b5d-4bf1-9ab4-39e808fa255e" containerID="0c2aae3dd332ac5fd91810780bdcdb95d058799fd08783bc21f27ebeab8555b2" exitCode=1
Apr 16 04:24:24.963197 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"89156d914d59c27c166b948c80495ca0f1a9b254a19e74b4a182aaef3eae4182"}
Apr 16 04:24:24.963197 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"f9fb2f8a878edcddf6a313e420244d489265824543eb6a38cb68238b710ba4c2"}
Apr 16 04:24:24.963388 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"8a799d8a8867d4fa1ed347efcbfd65d20fff3d3b915dd004a510bf89f51ad946"}
Apr 16 04:24:24.963388 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"86dd212e754fd197906fc642146eba1430157906d08ec32e4dacbcd23b4dc15d"}
Apr 16 04:24:24.963388 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerDied","Data":"0c2aae3dd332ac5fd91810780bdcdb95d058799fd08783bc21f27ebeab8555b2"}
Apr 16 04:24:24.963388 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.963240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"aa6c7919e8bbe8713a114e36e5f972869e4af725f06691f6b845bbc5380281c2"}
Apr 16 04:24:24.966462 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.966421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rk4kv" event={"ID":"6effaf82-47fd-4e1e-904a-407184af8a9d","Type":"ContainerStarted","Data":"db7a2a6682aab293f0954387bc0e9e008eff4dee1ab490ff308269604262409b"}
Apr 16 04:24:24.970744 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.970711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal" event={"ID":"1a60e730c1a02bac81e499c95b0d4aa1","Type":"ContainerStarted","Data":"ef161738cd3e0ca1998c9bcfd3fcb1fdcc6545d46c6de9786a1da0fac31d3087"}
Apr 16 04:24:24.972985 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.972951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ws575" event={"ID":"9a43d074-9490-4217-acb3-18783e647e75","Type":"ContainerStarted","Data":"a52b4a725968dd364b5e9d3d534581922b5e421515c3bc76fd8be6e44bcd870b"}
Apr 16 04:24:24.987069 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.986522 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rk4kv" podStartSLOduration=2.7577126229999998 podStartE2EDuration="20.986503936s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.001716681 +0000 UTC m=+1.818256739" lastFinishedPulling="2026-04-16 04:24:24.230507991 +0000 UTC m=+20.047048052" observedRunningTime="2026-04-16 04:24:24.984107579 +0000 UTC m=+20.800647660" watchObservedRunningTime="2026-04-16 04:24:24.986503936 +0000 UTC m=+20.803044019"
Apr 16 04:24:24.997682 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:24.997423 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-211.ec2.internal" podStartSLOduration=20.99740405 podStartE2EDuration="20.99740405s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:24.997311399 +0000 UTC m=+20.813851477" watchObservedRunningTime="2026-04-16 04:24:24.99740405 +0000 UTC m=+20.813944133"
Apr 16 04:24:25.011881 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.011837 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ws575" podStartSLOduration=1.914974111
podStartE2EDuration="20.011823385s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.099652759 +0000 UTC m=+1.916192821" lastFinishedPulling="2026-04-16 04:24:24.196502035 +0000 UTC m=+20.013042095" observedRunningTime="2026-04-16 04:24:25.011623872 +0000 UTC m=+20.828163954" watchObservedRunningTime="2026-04-16 04:24:25.011823385 +0000 UTC m=+20.828363464" Apr 16 04:24:25.787145 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.787122 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:25.787245 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:25.787221 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:25.812917 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.812894 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 04:24:25.976475 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.976441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" event={"ID":"25372649-86ad-45ac-ae3f-a871b8e3caa0","Type":"ContainerStarted","Data":"7ad740b9111e788002b2822d94356d886a0d3d64081ef3de666988a653f25ca5"} Apr 16 04:24:25.976475 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.976475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" 
event={"ID":"25372649-86ad-45ac-ae3f-a871b8e3caa0","Type":"ContainerStarted","Data":"e2fe9dfbb5c292a977e359a8684a28d0673d513aefb349826f74f6de3fefd26b"} Apr 16 04:24:25.977743 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.977709 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="54c07346ac384b281ac76a43945934ce5d2a3d29febc073ec7777f8ba8b1ab72" exitCode=0 Apr 16 04:24:25.977858 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.977781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"54c07346ac384b281ac76a43945934ce5d2a3d29febc073ec7777f8ba8b1ab72"} Apr 16 04:24:25.979124 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.979043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ctskn" event={"ID":"e9738669-31c5-4a1c-8e9b-4f0691464165","Type":"ContainerStarted","Data":"2ba414827b9d6e684cae5c40c4146788d2f789b8868dcb47cd86d5173e8cde71"} Apr 16 04:24:25.980317 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.980270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w4n2g" event={"ID":"97c67a71-c877-4bde-ae72-3f9915f6cb95","Type":"ContainerStarted","Data":"ac957e3693693d34818060b1ce4d95b029c129f1890d8a986cc3b125fb07fd6a"} Apr 16 04:24:25.981543 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.981511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s72p4" event={"ID":"6630844f-3950-46f3-b23d-355ceec908ec","Type":"ContainerStarted","Data":"92a897b654bc3f8e8f72bf5d019ddd7f8df2a46bf8c512a9d78af3fc163ec9e7"} Apr 16 04:24:25.982776 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.982746 2578 generic.go:358] "Generic (PLEG): container finished" podID="5592b42208bf0b26cc8688ad52a01ff0" 
containerID="5edc6a2bd512b6cdb460add701f8789479e7960d28a4e5019d7267b85e35e44d" exitCode=0 Apr 16 04:24:25.982867 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.982831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal" event={"ID":"5592b42208bf0b26cc8688ad52a01ff0","Type":"ContainerDied","Data":"5edc6a2bd512b6cdb460add701f8789479e7960d28a4e5019d7267b85e35e44d"} Apr 16 04:24:25.984150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:25.984127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-97kqm" event={"ID":"dd90ea8e-03cf-460f-8525-c28183fc3a33","Type":"ContainerStarted","Data":"55c45bb188db56c4e184db13551a7142906bfc6d391b6f4b695edfef95f56c15"} Apr 16 04:24:26.040482 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.040380 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ctskn" podStartSLOduration=3.862392125 podStartE2EDuration="22.040364976s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.01809682 +0000 UTC m=+1.834636878" lastFinishedPulling="2026-04-16 04:24:24.196069668 +0000 UTC m=+20.012609729" observedRunningTime="2026-04-16 04:24:26.027571448 +0000 UTC m=+21.844111529" watchObservedRunningTime="2026-04-16 04:24:26.040364976 +0000 UTC m=+21.856905057" Apr 16 04:24:26.040697 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.040673 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-97kqm" podStartSLOduration=2.9289750249999997 podStartE2EDuration="21.040665002s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.084795277 +0000 UTC m=+1.901335336" lastFinishedPulling="2026-04-16 04:24:24.196485242 +0000 UTC m=+20.013025313" observedRunningTime="2026-04-16 04:24:26.040111792 +0000 UTC m=+21.856651873" 
watchObservedRunningTime="2026-04-16 04:24:26.040665002 +0000 UTC m=+21.857205082" Apr 16 04:24:26.053268 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.053205 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-w4n2g" podStartSLOduration=3.84936963 podStartE2EDuration="22.053189174s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:05.990216671 +0000 UTC m=+1.806756731" lastFinishedPulling="2026-04-16 04:24:24.194036204 +0000 UTC m=+20.010576275" observedRunningTime="2026-04-16 04:24:26.053065597 +0000 UTC m=+21.869605679" watchObservedRunningTime="2026-04-16 04:24:26.053189174 +0000 UTC m=+21.869729255" Apr 16 04:24:26.067940 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.067901 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s72p4" podStartSLOduration=4.043528641 podStartE2EDuration="22.067889681s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.022139742 +0000 UTC m=+1.838679801" lastFinishedPulling="2026-04-16 04:24:24.04650077 +0000 UTC m=+19.863040841" observedRunningTime="2026-04-16 04:24:26.067376931 +0000 UTC m=+21.883917011" watchObservedRunningTime="2026-04-16 04:24:26.067889681 +0000 UTC m=+21.884429760" Apr 16 04:24:26.738339 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.738221 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T04:24:25.812911339Z","UUID":"393be4d6-a2fa-4f4b-b05c-75c6dd2f197c","Handler":null,"Name":"","Endpoint":""} Apr 16 04:24:26.740326 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.740306 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 
04:24:26.740417 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.740336 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 04:24:26.787076 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.787043 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:26.787239 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:26.787201 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:26.787757 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.787722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:26.787871 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:26.787825 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:26.988672 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.988647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:24:26.989137 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.988988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"e8fd99651cf7e67e07562eec1d9de7ce3d77490993586e299f68bd3ceb809456"} Apr 16 04:24:26.990891 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.990860 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" event={"ID":"25372649-86ad-45ac-ae3f-a871b8e3caa0","Type":"ContainerStarted","Data":"19edf123a373818a0662bfea02dde906884e53a82c325a98c28ce1c3179bda8b"} Apr 16 04:24:26.992610 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:26.992567 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal" event={"ID":"5592b42208bf0b26cc8688ad52a01ff0","Type":"ContainerStarted","Data":"9361677fe8be0fe204031d8adaf15f92cf990857e4083f085461837846ff8097"} Apr 16 04:24:27.025009 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:27.024961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hg5kl" podStartSLOduration=1.354451646 podStartE2EDuration="22.024943255s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.047866547 +0000 UTC m=+1.864406606" lastFinishedPulling="2026-04-16 04:24:26.718358154 +0000 UTC m=+22.534898215" observedRunningTime="2026-04-16 04:24:27.008796288 +0000 UTC 
m=+22.825336368" watchObservedRunningTime="2026-04-16 04:24:27.024943255 +0000 UTC m=+22.841483336" Apr 16 04:24:27.025651 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:27.025621 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-211.ec2.internal" podStartSLOduration=23.025613015 podStartE2EDuration="23.025613015s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:27.025086935 +0000 UTC m=+22.841626997" watchObservedRunningTime="2026-04-16 04:24:27.025613015 +0000 UTC m=+22.842153100" Apr 16 04:24:27.787579 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:27.787548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:27.787772 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:27.787682 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:28.661795 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:28.661755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:28.662192 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:28.661880 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:28.662192 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:28.661943 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret podName:32c323b6-9b7e-46de-ac37-f304c3267420 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:44.661929352 +0000 UTC m=+40.478469414 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret") pod "global-pull-secret-syncer-95bvb" (UID: "32c323b6-9b7e-46de-ac37-f304c3267420") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:28.787014 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:28.786980 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:28.787187 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:28.787096 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:28.787250 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:28.787189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:28.787334 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:28.787311 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:29.786649 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:29.786618 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:29.787102 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:29.786738 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:29.870914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:29.870725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:29.871361 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:29.871343 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:30.001345 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.001320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:24:30.002042 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.001997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"ea36affa701f68e977c93b3b364c4127c7d587ae051e1561a929d63294be699d"} Apr 16 04:24:30.002377 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.002327 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:30.002598 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.002581 2578 scope.go:117] "RemoveContainer" containerID="0c2aae3dd332ac5fd91810780bdcdb95d058799fd08783bc21f27ebeab8555b2" Apr 16 04:24:30.003218 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.003050 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-97kqm" Apr 16 04:24:30.787657 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.787480 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:30.788286 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:30.787531 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:30.788286 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:30.787725 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:30.788286 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:30.787813 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:31.006226 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.006200 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:24:31.006551 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.006520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" event={"ID":"b57f188f-8b5d-4bf1-9ab4-39e808fa255e","Type":"ContainerStarted","Data":"bef65e9d319bc967fb7eca5a3b2d7b4b5429d4f806035865c5c514431958ad57"} Apr 16 04:24:31.006654 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.006612 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 04:24:31.006815 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.006782 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:31.006815 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.006811 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:31.008325 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.008287 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="e2b55e3e67a3bd6805d25099df3feee522438a24ee16907c5d1585eb678a487e" exitCode=0 Apr 16 04:24:31.008420 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.008390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"e2b55e3e67a3bd6805d25099df3feee522438a24ee16907c5d1585eb678a487e"} Apr 16 04:24:31.021924 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.021905 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:31.022008 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.021966 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:31.033214 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.033179 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" podStartSLOduration=7.761557223 podStartE2EDuration="26.033166416s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.065832585 +0000 UTC m=+1.882372643" lastFinishedPulling="2026-04-16 04:24:24.33744176 +0000 UTC m=+20.153981836" observedRunningTime="2026-04-16 04:24:31.031647435 +0000 UTC m=+26.848187515" watchObservedRunningTime="2026-04-16 04:24:31.033166416 +0000 UTC m=+26.849706495" Apr 16 04:24:31.787448 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.787414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:31.787611 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:31.787560 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:31.848606 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.848580 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhcj5"] Apr 16 04:24:31.851833 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.851512 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-95bvb"] Apr 16 04:24:31.851833 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.851651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:31.851833 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:31.851753 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:31.852254 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.852232 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mnzb4"] Apr 16 04:24:31.852378 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:31.852365 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:31.852480 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:31.852462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:32.012050 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:32.012022 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="607593258ecfd6ab6c5ca824eb93b2643cebd831f3394dddf504b87b2010890e" exitCode=0 Apr 16 04:24:32.012206 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:32.012138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"607593258ecfd6ab6c5ca824eb93b2643cebd831f3394dddf504b87b2010890e"} Apr 16 04:24:32.012272 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:32.012218 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 04:24:32.013325 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:32.012681 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:32.013325 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:32.012801 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:33.015977 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.015947 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="9e9c1dd698ec64d57931966e3dfd31f5b19ba8ee5507b47f3103abc71fc361a3" exitCode=0 Apr 16 04:24:33.016351 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.016017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"9e9c1dd698ec64d57931966e3dfd31f5b19ba8ee5507b47f3103abc71fc361a3"} Apr 16 04:24:33.016351 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.016119 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 04:24:33.787584 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.787550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:33.787584 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.787573 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:33.787828 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:33.787550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:33.787828 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:33.787666 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:33.787828 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:33.787748 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:33.787954 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:33.787866 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:34.580353 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:34.580319 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:24:34.580903 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:34.580598 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 04:24:34.591716 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:34.591628 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" podUID="b57f188f-8b5d-4bf1-9ab4-39e808fa255e" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 04:24:34.601519 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:34.601478 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" podUID="b57f188f-8b5d-4bf1-9ab4-39e808fa255e" 
containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 04:24:35.787347 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:35.787098 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:35.787792 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:35.787134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:35.787792 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:35.787452 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhcj5" podUID="b30a87b4-65a2-4504-be52-b10fb247dedb" Apr 16 04:24:35.787792 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:35.787145 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:35.787792 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:35.787547 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-95bvb" podUID="32c323b6-9b7e-46de-ac37-f304c3267420" Apr 16 04:24:35.787792 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:35.787587 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnzb4" podUID="d17e7eff-b5d0-404c-bf53-695801a18097" Apr 16 04:24:37.426340 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.426246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:37.426752 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.426397 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:37.426752 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.426458 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.426443314 +0000 UTC m=+65.242983372 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:37.509716 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.509687 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-211.ec2.internal" event="NodeReady" Apr 16 04:24:37.509887 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.509838 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 04:24:37.527145 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.527117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:37.527317 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.527290 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:37.527387 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.527323 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:37.527387 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.527337 2578 projected.go:194] Error preparing data for projected volume kube-api-access-n6m5d for pod openshift-network-diagnostics/network-check-target-mnzb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 16 04:24:37.527474 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.527388 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d podName:d17e7eff-b5d0-404c-bf53-695801a18097 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.527374897 +0000 UTC m=+65.343914954 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-n6m5d" (UniqueName: "kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d") pod "network-check-target-mnzb4" (UID: "d17e7eff-b5d0-404c-bf53-695801a18097") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:37.549176 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.549145 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zn869"] Apr 16 04:24:37.573235 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.573204 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bntvm"] Apr 16 04:24:37.573450 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.573427 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.575949 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.575925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 04:24:37.575949 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.575943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\"" Apr 16 04:24:37.576143 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.575992 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 04:24:37.591770 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.591752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zn869"] Apr 16 04:24:37.591770 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.591772 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bntvm"] Apr 16 04:24:37.591924 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.591865 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:37.594525 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.594497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 04:24:37.594635 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.594549 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 04:24:37.594635 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.594612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\"" Apr 16 04:24:37.594819 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.594802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 04:24:37.728380 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lc7\" (UniqueName: \"kubernetes.io/projected/4248c02e-a96d-4c4c-a829-8fa7ddd59809-kube-api-access-w5lc7\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.728569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.728569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728415 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/4248c02e-a96d-4c4c-a829-8fa7ddd59809-tmp-dir\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.728569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4248c02e-a96d-4c4c-a829-8fa7ddd59809-config-volume\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.728569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:37.728569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.728522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhbd\" (UniqueName: \"kubernetes.io/projected/9cf31f3f-502a-403b-a015-b0d26d2ac92f-kube-api-access-vvhbd\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:37.786715 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.786676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:37.786911 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.786692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:24:37.786911 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.786808 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:24:37.789589 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.789567 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 04:24:37.789739 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.789593 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l4rjf\"" Apr 16 04:24:37.792783 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.789942 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 04:24:37.792783 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.790068 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 04:24:37.792783 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.790214 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 04:24:37.792783 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.790429 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\"" Apr 16 04:24:37.829871 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.829843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lc7\" (UniqueName: \"kubernetes.io/projected/4248c02e-a96d-4c4c-a829-8fa7ddd59809-kube-api-access-w5lc7\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.830010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.829887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.830010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.829916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4248c02e-a96d-4c4c-a829-8fa7ddd59809-tmp-dir\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.830010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.829948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4248c02e-a96d-4c4c-a829-8fa7ddd59809-config-volume\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.830010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.829986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:37.830010 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.830008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhbd\" (UniqueName: \"kubernetes.io/projected/9cf31f3f-502a-403b-a015-b0d26d2ac92f-kube-api-access-vvhbd\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:37.830267 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.830013 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:37.830267 
ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.830080 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.330058924 +0000 UTC m=+34.146598985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:37.830411 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.830275 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:37.830411 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:37.830347 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.33033354 +0000 UTC m=+34.146873602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:24:37.830637 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.830608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4248c02e-a96d-4c4c-a829-8fa7ddd59809-config-volume\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.840002 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.839962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4248c02e-a96d-4c4c-a829-8fa7ddd59809-tmp-dir\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.843651 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.843629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lc7\" (UniqueName: \"kubernetes.io/projected/4248c02e-a96d-4c4c-a829-8fa7ddd59809-kube-api-access-w5lc7\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:37.843789 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:37.843763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhbd\" (UniqueName: \"kubernetes.io/projected/9cf31f3f-502a-403b-a015-b0d26d2ac92f-kube-api-access-vvhbd\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:38.335252 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:38.335207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:38.335526 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:38.335311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:38.335526 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:38.335379 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:38.335526 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:38.335449 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:39.335430778 +0000 UTC m=+35.151970835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:24:38.335526 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:38.335380 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:38.335711 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:38.335559 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:39.335539915 +0000 UTC m=+35.152079991 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:39.031527 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:39.031447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerStarted","Data":"834c05a92926558ebd978cfbb40e404fb1d552a3aff0b71febee93d3fb0beb13"} Apr 16 04:24:39.341760 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:39.341678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:39.341901 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:39.341776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:39.341901 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:39.341832 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:39.341901 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:39.341859 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:39.341901 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:39.341898 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert 
podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:41.341880546 +0000 UTC m=+37.158420625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:24:39.342041 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:39.341912 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:41.341906628 +0000 UTC m=+37.158446689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:40.035652 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:40.035621 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="834c05a92926558ebd978cfbb40e404fb1d552a3aff0b71febee93d3fb0beb13" exitCode=0 Apr 16 04:24:40.036072 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:40.035667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"834c05a92926558ebd978cfbb40e404fb1d552a3aff0b71febee93d3fb0beb13"} Apr 16 04:24:41.039880 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:41.039850 2578 generic.go:358] "Generic (PLEG): container finished" podID="0763391d-17aa-4fb0-a753-c705589537ab" containerID="b1763a63e26a2527698ba7f74cb223178879af0bf5cd8f791cd727d638eaa9c3" exitCode=0 Apr 16 04:24:41.040349 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:24:41.039897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerDied","Data":"b1763a63e26a2527698ba7f74cb223178879af0bf5cd8f791cd727d638eaa9c3"} Apr 16 04:24:41.358395 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:41.358310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:41.358395 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:41.358360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:41.358560 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:41.358430 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:41.358560 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:41.358441 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:41.358560 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:41.358504 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:45.358485962 +0000 UTC m=+41.175026043 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:24:41.358560 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:41.358521 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:45.35851486 +0000 UTC m=+41.175054918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:42.044332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:42.044140 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-btwn7" event={"ID":"0763391d-17aa-4fb0-a753-c705589537ab","Type":"ContainerStarted","Data":"7fe34aced6646e21ea23eb11b3ef4bafaf9e8a52d19c60107427c20da0b7435b"} Apr 16 04:24:42.072013 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:42.071967 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-btwn7" podStartSLOduration=4.323305857 podStartE2EDuration="37.071952847s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:06.031522253 +0000 UTC m=+1.848062312" lastFinishedPulling="2026-04-16 04:24:38.78016924 +0000 UTC m=+34.596709302" observedRunningTime="2026-04-16 04:24:42.071657474 +0000 UTC m=+37.888197553" watchObservedRunningTime="2026-04-16 04:24:42.071952847 +0000 UTC m=+37.888492927" Apr 16 04:24:44.681901 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:44.681864 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:44.684564 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:44.684546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32c323b6-9b7e-46de-ac37-f304c3267420-original-pull-secret\") pod \"global-pull-secret-syncer-95bvb\" (UID: \"32c323b6-9b7e-46de-ac37-f304c3267420\") " pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:44.701653 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:44.701630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95bvb" Apr 16 04:24:44.911171 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:44.911140 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-95bvb"] Apr 16 04:24:44.915980 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:24:44.915947 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c323b6_9b7e_46de_ac37_f304c3267420.slice/crio-8edbec2ae8320d168c09428989f71fde2b9f81113ef47166909730ed08a540cf WatchSource:0}: Error finding container 8edbec2ae8320d168c09428989f71fde2b9f81113ef47166909730ed08a540cf: Status 404 returned error can't find the container with id 8edbec2ae8320d168c09428989f71fde2b9f81113ef47166909730ed08a540cf Apr 16 04:24:45.050526 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:45.050490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-95bvb" 
event={"ID":"32c323b6-9b7e-46de-ac37-f304c3267420","Type":"ContainerStarted","Data":"8edbec2ae8320d168c09428989f71fde2b9f81113ef47166909730ed08a540cf"} Apr 16 04:24:45.386785 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:45.386685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:45.386785 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:45.386740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:45.386994 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:45.386840 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:45.386994 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:45.386869 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:45.386994 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:45.386907 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:53.386890477 +0000 UTC m=+49.203430534 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:45.386994 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:45.386921 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:53.386915029 +0000 UTC m=+49.203455086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:24:51.063569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:51.063538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-95bvb" event={"ID":"32c323b6-9b7e-46de-ac37-f304c3267420","Type":"ContainerStarted","Data":"219c2ef0c3cc1166921eb7ffdc6c766418d3fdd5484bb5c9f988411f9bab8123"} Apr 16 04:24:51.077670 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:51.077628 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-95bvb" podStartSLOduration=34.016241222 podStartE2EDuration="39.077614145s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:44.917681348 +0000 UTC m=+40.734221406" lastFinishedPulling="2026-04-16 04:24:49.979054268 +0000 UTC m=+45.795594329" observedRunningTime="2026-04-16 04:24:51.076932771 +0000 UTC m=+46.893472853" watchObservedRunningTime="2026-04-16 04:24:51.077614145 +0000 UTC m=+46.894154220" Apr 16 04:24:53.441599 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:53.441564 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:24:53.441974 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:24:53.441630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:24:53.441974 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:53.441710 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:53.441974 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:53.441763 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.441748611 +0000 UTC m=+65.258288669 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:24:53.441974 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:53.441709 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:53.441974 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:24:53.441848 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. 
No retries permitted until 2026-04-16 04:25:09.4418337 +0000 UTC m=+65.258373763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:25:04.601790 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:04.601757 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vmzk" Apr 16 04:25:06.130914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.130879 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6"] Apr 16 04:25:06.135523 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.135505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.139356 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.139328 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 04:25:06.139356 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.139344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 04:25:06.139544 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.139344 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 04:25:06.139544 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.139384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 
04:25:06.139544 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.139348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4smmf\"" Apr 16 04:25:06.140935 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.140918 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6"] Apr 16 04:25:06.162888 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.162865 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6"] Apr 16 04:25:06.165564 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.165550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.168217 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.168089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 04:25:06.168217 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.168125 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 04:25:06.168390 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.168349 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 04:25:06.168551 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.168537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 04:25:06.174958 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.174942 
2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6"] Apr 16 04:25:06.228009 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.227979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.228150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th64v\" (UniqueName: \"kubernetes.io/projected/e382fd46-564e-4a07-898c-8f146dfae64b-kube-api-access-th64v\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.228150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/e382fd46-564e-4a07-898c-8f146dfae64b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.228150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.228150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqddj\" (UniqueName: \"kubernetes.io/projected/940307cd-016d-4015-b6f4-c3b5bc4685b5-kube-api-access-gqddj\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.228150 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/940307cd-016d-4015-b6f4-c3b5bc4685b5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.228355 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228156 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.228355 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.228243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.329504 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.329504 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th64v\" (UniqueName: \"kubernetes.io/projected/e382fd46-564e-4a07-898c-8f146dfae64b-kube-api-access-th64v\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.329685 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/e382fd46-564e-4a07-898c-8f146dfae64b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.329685 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.329685 ip-10-0-140-211 kubenswrapper[2578]: 
I0416 04:25:06.329665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqddj\" (UniqueName: \"kubernetes.io/projected/940307cd-016d-4015-b6f4-c3b5bc4685b5-kube-api-access-gqddj\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.329829 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/940307cd-016d-4015-b6f4-c3b5bc4685b5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.329930 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.329899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.330075 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.330004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.330377 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.330353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/e382fd46-564e-4a07-898c-8f146dfae64b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.332081 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.332056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.332314 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.332279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.332398 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.332372 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/940307cd-016d-4015-b6f4-c3b5bc4685b5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.332630 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.332608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-ca\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.332713 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.332697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e382fd46-564e-4a07-898c-8f146dfae64b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.337629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.337607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqddj\" (UniqueName: \"kubernetes.io/projected/940307cd-016d-4015-b6f4-c3b5bc4685b5-kube-api-access-gqddj\") pod \"managed-serviceaccount-addon-agent-57599cd6c-2b9m6\" (UID: \"940307cd-016d-4015-b6f4-c3b5bc4685b5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.337629 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.337622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th64v\" (UniqueName: \"kubernetes.io/projected/e382fd46-564e-4a07-898c-8f146dfae64b-kube-api-access-th64v\") pod \"cluster-proxy-proxy-agent-6f8bc7bf7c-792r6\" (UID: \"e382fd46-564e-4a07-898c-8f146dfae64b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.455620 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.455527 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" Apr 16 04:25:06.473407 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.473358 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:25:06.573557 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.573527 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6"] Apr 16 04:25:06.576600 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:25:06.576568 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940307cd_016d_4015_b6f4_c3b5bc4685b5.slice/crio-bbe1f19dd0b94b5afc5e4997ba813c822f6b6926cc62025320b28935ca7acca4 WatchSource:0}: Error finding container bbe1f19dd0b94b5afc5e4997ba813c822f6b6926cc62025320b28935ca7acca4: Status 404 returned error can't find the container with id bbe1f19dd0b94b5afc5e4997ba813c822f6b6926cc62025320b28935ca7acca4 Apr 16 04:25:06.590443 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:06.590415 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6"] Apr 16 04:25:06.593699 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:25:06.593678 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode382fd46_564e_4a07_898c_8f146dfae64b.slice/crio-b8ae97033be721d34179565219fd0a5b82ead6370b3882a449f8466c646854a3 WatchSource:0}: Error finding container b8ae97033be721d34179565219fd0a5b82ead6370b3882a449f8466c646854a3: Status 404 returned error can't find the container with id b8ae97033be721d34179565219fd0a5b82ead6370b3882a449f8466c646854a3 Apr 16 04:25:07.095849 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:07.095807 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" 
event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerStarted","Data":"b8ae97033be721d34179565219fd0a5b82ead6370b3882a449f8466c646854a3"} Apr 16 04:25:07.096912 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:07.096883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" event={"ID":"940307cd-016d-4015-b6f4-c3b5bc4685b5","Type":"ContainerStarted","Data":"bbe1f19dd0b94b5afc5e4997ba813c822f6b6926cc62025320b28935ca7acca4"} Apr 16 04:25:09.453522 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.453479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.453561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5" Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.453619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.453698 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.453741 
2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.453778 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.453756007 +0000 UTC m=+97.270296066 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found Apr 16 04:25:09.453982 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.453801 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.45379033 +0000 UTC m=+97.270330393 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found Apr 16 04:25:09.456454 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.456427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 04:25:09.464107 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.464089 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 04:25:09.464235 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:09.464146 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs podName:b30a87b4-65a2-4504-be52-b10fb247dedb nodeName:}" failed. No retries permitted until 2026-04-16 04:26:13.464127817 +0000 UTC m=+129.280667898 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs") pod "network-metrics-daemon-qhcj5" (UID: "b30a87b4-65a2-4504-be52-b10fb247dedb") : secret "metrics-daemon-secret" not found Apr 16 04:25:09.554961 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.554923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:25:09.557573 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.557548 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 04:25:09.567808 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.567782 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 04:25:09.578609 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.578585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6m5d\" (UniqueName: \"kubernetes.io/projected/d17e7eff-b5d0-404c-bf53-695801a18097-kube-api-access-n6m5d\") pod \"network-check-target-mnzb4\" (UID: \"d17e7eff-b5d0-404c-bf53-695801a18097\") " pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:25:09.611548 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.611521 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l4rjf\"" Apr 16 04:25:09.619688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:09.619668 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnzb4" Apr 16 04:25:10.344548 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:10.344514 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mnzb4"] Apr 16 04:25:10.347969 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:25:10.347944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17e7eff_b5d0_404c_bf53_695801a18097.slice/crio-b30f2081e9a6eea476f963ca71f070cd4b981bf3fcbe872cc0ba6d8386537078 WatchSource:0}: Error finding container b30f2081e9a6eea476f963ca71f070cd4b981bf3fcbe872cc0ba6d8386537078: Status 404 returned error can't find the container with id b30f2081e9a6eea476f963ca71f070cd4b981bf3fcbe872cc0ba6d8386537078 Apr 16 04:25:11.108041 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:11.108001 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" event={"ID":"940307cd-016d-4015-b6f4-c3b5bc4685b5","Type":"ContainerStarted","Data":"f320f269a2047ff17a2cd2c74d8823d3c5256c1369860c5e56583573ea440684"} Apr 16 04:25:11.109846 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:11.109765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerStarted","Data":"b37ac1c633275c5d091d5f3cc22dbd60bbf30e2d01f2a1d328a745dce41c43e9"} Apr 16 04:25:11.111464 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:11.111440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mnzb4" event={"ID":"d17e7eff-b5d0-404c-bf53-695801a18097","Type":"ContainerStarted","Data":"b30f2081e9a6eea476f963ca71f070cd4b981bf3fcbe872cc0ba6d8386537078"} Apr 16 04:25:11.122942 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:25:11.122789 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57599cd6c-2b9m6" podStartSLOduration=1.469546368 podStartE2EDuration="5.122772951s" podCreationTimestamp="2026-04-16 04:25:06 +0000 UTC" firstStartedPulling="2026-04-16 04:25:06.578475352 +0000 UTC m=+62.395015411" lastFinishedPulling="2026-04-16 04:25:10.231701922 +0000 UTC m=+66.048241994" observedRunningTime="2026-04-16 04:25:11.121920043 +0000 UTC m=+66.938460146" watchObservedRunningTime="2026-04-16 04:25:11.122772951 +0000 UTC m=+66.939313013"
Apr 16 04:25:14.123487 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.123449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerStarted","Data":"511666ba36a0cefd5d3338af5b75c217a2bfbddd2084a46d2a9fdc627e04c1b1"}
Apr 16 04:25:14.123487 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.123491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerStarted","Data":"530a27b0faf6b4a1a4db2b91c63d5c246c1c10ea2b61af68d594f47365e004c8"}
Apr 16 04:25:14.124793 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.124773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mnzb4" event={"ID":"d17e7eff-b5d0-404c-bf53-695801a18097","Type":"ContainerStarted","Data":"07bab07a59e7356a373ccff2f900a7747d310800448cf77394d7c2e322984250"}
Apr 16 04:25:14.124893 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.124881 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:25:14.141522 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.141489 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" podStartSLOduration=1.294313706 podStartE2EDuration="8.141478663s" podCreationTimestamp="2026-04-16 04:25:06 +0000 UTC" firstStartedPulling="2026-04-16 04:25:06.595434033 +0000 UTC m=+62.411974094" lastFinishedPulling="2026-04-16 04:25:13.442598991 +0000 UTC m=+69.259139051" observedRunningTime="2026-04-16 04:25:14.140217122 +0000 UTC m=+69.956757201" watchObservedRunningTime="2026-04-16 04:25:14.141478663 +0000 UTC m=+69.958018743"
Apr 16 04:25:14.153905 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:14.153870 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mnzb4" podStartSLOduration=66.054749487 podStartE2EDuration="1m9.153859581s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:25:10.349825963 +0000 UTC m=+66.166366022" lastFinishedPulling="2026-04-16 04:25:13.448936052 +0000 UTC m=+69.265476116" observedRunningTime="2026-04-16 04:25:14.15339131 +0000 UTC m=+69.969931390" watchObservedRunningTime="2026-04-16 04:25:14.153859581 +0000 UTC m=+69.970399660"
Apr 16 04:25:41.477648 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:41.477601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869"
Apr 16 04:25:41.477648 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:41.477658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm"
Apr 16 04:25:41.478124 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:41.477748 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 04:25:41.478124 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:41.477752 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 04:25:41.478124 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:41.477808 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert podName:9cf31f3f-502a-403b-a015-b0d26d2ac92f nodeName:}" failed. No retries permitted until 2026-04-16 04:26:45.477793902 +0000 UTC m=+161.294333960 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert") pod "ingress-canary-bntvm" (UID: "9cf31f3f-502a-403b-a015-b0d26d2ac92f") : secret "canary-serving-cert" not found
Apr 16 04:25:41.478124 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:25:41.477821 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls podName:4248c02e-a96d-4c4c-a829-8fa7ddd59809 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:45.477815306 +0000 UTC m=+161.294355363 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls") pod "dns-default-zn869" (UID: "4248c02e-a96d-4c4c-a829-8fa7ddd59809") : secret "dns-default-metrics-tls" not found
Apr 16 04:25:41.541951 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:41.541926 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s72p4_6630844f-3950-46f3-b23d-355ceec908ec/dns-node-resolver/0.log"
Apr 16 04:25:42.340276 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:42.340250 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ctskn_e9738669-31c5-4a1c-8e9b-4f0691464165/node-ca/0.log"
Apr 16 04:25:45.129700 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:25:45.129671 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mnzb4"
Apr 16 04:26:01.207687 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.207656 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-746fd49c8c-95h62"]
Apr 16 04:26:01.212642 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.212624 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.215310 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.215268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 04:26:01.215463 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.215443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 04:26:01.215463 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.215451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 04:26:01.216710 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.216692 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hvkh7\""
Apr 16 04:26:01.220803 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.220764 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 04:26:01.223835 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.223817 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-746fd49c8c-95h62"]
Apr 16 04:26:01.240924 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.240899 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9728b"]
Apr 16 04:26:01.244081 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.244065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.246922 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.246896 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 04:26:01.247039 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.247023 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 04:26:01.247309 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.247280 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 04:26:01.247404 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.247387 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 04:26:01.247447 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.247400 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stqlj\""
Apr 16 04:26:01.253678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.253657 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9728b"]
Apr 16 04:26:01.319366 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb841652-cbca-45d5-970b-df6c93b00e4f-ca-trust-extracted\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319366 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-tls\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb2875df-44e0-4977-8768-9186ea96a129-data-volume\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnwx\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-kube-api-access-9cnwx\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-bound-sa-token\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fb2875df-44e0-4977-8768-9186ea96a129-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-trusted-ca\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-installation-pull-secrets\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319795 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-image-registry-private-configuration\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319795 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb2875df-44e0-4977-8768-9186ea96a129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.319795 ip-10-0-140-211
kubenswrapper[2578]: I0416 04:26:01.319678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-certificates\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.319795 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb2875df-44e0-4977-8768-9186ea96a129-crio-socket\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.319795 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.319716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7lls\" (UniqueName: \"kubernetes.io/projected/fb2875df-44e0-4977-8768-9186ea96a129-kube-api-access-b7lls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.420435 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-trusted-ca\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420435 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-installation-pull-secrets\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-image-registry-private-configuration\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb2875df-44e0-4977-8768-9186ea96a129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.420688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-certificates\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb2875df-44e0-4977-8768-9186ea96a129-crio-socket\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.420688 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7lls\" (UniqueName: \"kubernetes.io/projected/fb2875df-44e0-4977-8768-9186ea96a129-kube-api-access-b7lls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb841652-cbca-45d5-970b-df6c93b00e4f-ca-trust-extracted\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-tls\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb2875df-44e0-4977-8768-9186ea96a129-data-volume\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnwx\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-kube-api-access-9cnwx\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-bound-sa-token\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.420932 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.420883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fb2875df-44e0-4977-8768-9186ea96a129-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.421220 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.421042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb2875df-44e0-4977-8768-9186ea96a129-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.421274 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.421249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb2875df-44e0-4977-8768-9186ea96a129-data-volume\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.421523 ip-10-0-140-211
kubenswrapper[2578]: I0416 04:26:01.421499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb2875df-44e0-4977-8768-9186ea96a129-crio-socket\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.421617 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.421501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-trusted-ca\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.421672 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.421618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb841652-cbca-45d5-970b-df6c93b00e4f-ca-trust-extracted\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.422186 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.422135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-certificates\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.423160 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.423142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-installation-pull-secrets\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.423239 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.423216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fb2875df-44e0-4977-8768-9186ea96a129-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.423288 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.423231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb841652-cbca-45d5-970b-df6c93b00e4f-image-registry-private-configuration\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.423617 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.423600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-registry-tls\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.427613 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.427583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7lls\" (UniqueName: \"kubernetes.io/projected/fb2875df-44e0-4977-8768-9186ea96a129-kube-api-access-b7lls\") pod \"insights-runtime-extractor-9728b\" (UID: \"fb2875df-44e0-4977-8768-9186ea96a129\") " pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.427771 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.427755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnwx\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-kube-api-access-9cnwx\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.428216 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.428197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb841652-cbca-45d5-970b-df6c93b00e4f-bound-sa-token\") pod \"image-registry-746fd49c8c-95h62\" (UID: \"fb841652-cbca-45d5-970b-df6c93b00e4f\") " pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.521756 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.521723 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:01.553075 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.553037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9728b"
Apr 16 04:26:01.643358 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.643324 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-746fd49c8c-95h62"]
Apr 16 04:26:01.646976 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:01.646944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb841652_cbca_45d5_970b_df6c93b00e4f.slice/crio-45efc0012099ed43c8f0dbb2a943e8c9d33c5fe7e5aedc2e333b073c3308ebc1 WatchSource:0}: Error finding container 45efc0012099ed43c8f0dbb2a943e8c9d33c5fe7e5aedc2e333b073c3308ebc1: Status 404 returned error can't find the container with id 45efc0012099ed43c8f0dbb2a943e8c9d33c5fe7e5aedc2e333b073c3308ebc1
Apr 16 04:26:01.680683 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:01.680656 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9728b"]
Apr 16 04:26:01.683637 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:01.683605 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2875df_44e0_4977_8768_9186ea96a129.slice/crio-cdbf46e297f35e79e5b537449c3a0e7711b93b5333c4a0eca34a1e7b6d4776e6 WatchSource:0}: Error finding container cdbf46e297f35e79e5b537449c3a0e7711b93b5333c4a0eca34a1e7b6d4776e6: Status 404 returned error can't find the container with id cdbf46e297f35e79e5b537449c3a0e7711b93b5333c4a0eca34a1e7b6d4776e6
Apr 16 04:26:02.239828 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:02.239792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9728b" event={"ID":"fb2875df-44e0-4977-8768-9186ea96a129","Type":"ContainerStarted","Data":"b45f126dcf88695a7b43331d24ad9cba40c330365a027eb8aee9e09ca41504aa"}
Apr 16 04:26:02.239828 ip-10-0-140-211
kubenswrapper[2578]: I0416 04:26:02.239831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9728b" event={"ID":"fb2875df-44e0-4977-8768-9186ea96a129","Type":"ContainerStarted","Data":"cdbf46e297f35e79e5b537449c3a0e7711b93b5333c4a0eca34a1e7b6d4776e6"}
Apr 16 04:26:02.241044 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:02.241018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-746fd49c8c-95h62" event={"ID":"fb841652-cbca-45d5-970b-df6c93b00e4f","Type":"ContainerStarted","Data":"0e220bad821f9d25781e58e47ce31757dc8d3ed8bfcb3c51a2cf24cb5eaadf40"}
Apr 16 04:26:02.241137 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:02.241049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-746fd49c8c-95h62" event={"ID":"fb841652-cbca-45d5-970b-df6c93b00e4f","Type":"ContainerStarted","Data":"45efc0012099ed43c8f0dbb2a943e8c9d33c5fe7e5aedc2e333b073c3308ebc1"}
Apr 16 04:26:02.241180 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:02.241161 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-746fd49c8c-95h62"
Apr 16 04:26:02.258986 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:02.258948 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-746fd49c8c-95h62" podStartSLOduration=1.258935087 podStartE2EDuration="1.258935087s" podCreationTimestamp="2026-04-16 04:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:26:02.258402219 +0000 UTC m=+118.074942303" watchObservedRunningTime="2026-04-16 04:26:02.258935087 +0000 UTC m=+118.075475167"
Apr 16 04:26:03.244976 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:03.244925 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9728b" event={"ID":"fb2875df-44e0-4977-8768-9186ea96a129","Type":"ContainerStarted","Data":"8af879a6a1387373dce307c69a2d1fbf6e140e8a7de7a27962e2184fe97b75b0"}
Apr 16 04:26:04.248466 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:04.248434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9728b" event={"ID":"fb2875df-44e0-4977-8768-9186ea96a129","Type":"ContainerStarted","Data":"d26225a1bb562ff5b80b3fb297f56dd9017f65abdddcad38b1269cf0c24bae5d"}
Apr 16 04:26:04.265978 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:04.265935 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9728b" podStartSLOduration=1.059284178 podStartE2EDuration="3.265920254s" podCreationTimestamp="2026-04-16 04:26:01 +0000 UTC" firstStartedPulling="2026-04-16 04:26:01.742978112 +0000 UTC m=+117.559518184" lastFinishedPulling="2026-04-16 04:26:03.949614203 +0000 UTC m=+119.766154260" observedRunningTime="2026-04-16 04:26:04.264156319 +0000 UTC m=+120.080696398" watchObservedRunningTime="2026-04-16 04:26:04.265920254 +0000 UTC m=+120.082460334"
Apr 16 04:26:13.512962 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:13.512922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:26:13.515151 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:13.515119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30a87b4-65a2-4504-be52-b10fb247dedb-metrics-certs\") pod \"network-metrics-daemon-qhcj5\" (UID: \"b30a87b4-65a2-4504-be52-b10fb247dedb\") " pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:26:13.518458 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:13.518441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\""
Apr 16 04:26:13.526637 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:13.526613 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhcj5"
Apr 16 04:26:13.635589 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:13.635559 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhcj5"]
Apr 16 04:26:13.639484 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:13.639461 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30a87b4_65a2_4504_be52_b10fb247dedb.slice/crio-924ccb5de76e562d82e9912c0c84f9e922d5a7de249ddc355a68582068063bfc WatchSource:0}: Error finding container 924ccb5de76e562d82e9912c0c84f9e922d5a7de249ddc355a68582068063bfc: Status 404 returned error can't find the container with id 924ccb5de76e562d82e9912c0c84f9e922d5a7de249ddc355a68582068063bfc
Apr 16 04:26:14.275140 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:14.275102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhcj5" event={"ID":"b30a87b4-65a2-4504-be52-b10fb247dedb","Type":"ContainerStarted","Data":"924ccb5de76e562d82e9912c0c84f9e922d5a7de249ddc355a68582068063bfc"}
Apr 16 04:26:15.280054 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:15.280012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhcj5" event={"ID":"b30a87b4-65a2-4504-be52-b10fb247dedb","Type":"ContainerStarted","Data":"a071197f784ff8331fe26ef3fdd78f9b548f65d0bbf4d6fe65cf01eec997b34d"}
Apr 16 04:26:15.280054 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:15.280059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhcj5" event={"ID":"b30a87b4-65a2-4504-be52-b10fb247dedb","Type":"ContainerStarted","Data":"3adee2384e882b8e61e859393381dbad02e8e670c60be026af9857897e909495"}
Apr 16 04:26:15.295704 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:15.295655 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qhcj5" podStartSLOduration=130.317301327 podStartE2EDuration="2m11.295640673s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:26:13.64135792 +0000 UTC m=+129.457897978" lastFinishedPulling="2026-04-16 04:26:14.619697252 +0000 UTC m=+130.436237324" observedRunningTime="2026-04-16 04:26:15.294887376 +0000 UTC m=+131.111427457" watchObservedRunningTime="2026-04-16 04:26:15.295640673 +0000 UTC m=+131.112180753"
Apr 16 04:26:19.764452 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.764417 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-56vp9"]
Apr 16 04:26:19.767553 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.767531 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.770056 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.770032 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 04:26:19.770224 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.770209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 04:26:19.770336 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.770318 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 04:26:19.770408 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.770386 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 04:26:19.770463 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.770449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 04:26:19.771226 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.771207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 04:26:19.771360 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.771216 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s26sx\"" Apr 16 04:26:19.859798 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859775 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-sys\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " 
pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-textfile\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-root\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" 
(UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-accelerators-collector-config\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.859995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.859987 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjsrs\" (UniqueName: \"kubernetes.io/projected/302d14c0-18e8-4940-948c-3b0fdf2bba1b-kube-api-access-wjsrs\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.860117 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.860011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-metrics-client-ca\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.860117 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.860035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-wtmp\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960361 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-root\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960369 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-accelerators-collector-config\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960395 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-root\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjsrs\" (UniqueName: \"kubernetes.io/projected/302d14c0-18e8-4940-948c-3b0fdf2bba1b-kube-api-access-wjsrs\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-metrics-client-ca\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960530 
ip-10-0-140-211 kubenswrapper[2578]: E0416 04:26:19.960515 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-wtmp\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:26:19.960589 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls podName:302d14c0-18e8-4940-948c-3b0fdf2bba1b nodeName:}" failed. No retries permitted until 2026-04-16 04:26:20.460567808 +0000 UTC m=+136.277107865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls") pod "node-exporter-56vp9" (UID: "302d14c0-18e8-4940-948c-3b0fdf2bba1b") : secret "node-exporter-tls" not found Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-sys\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-textfile\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " 
pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-wtmp\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/302d14c0-18e8-4940-948c-3b0fdf2bba1b-sys\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.960812 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.961147 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.960966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-metrics-client-ca\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.961147 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.961019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.961147 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.961065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-textfile\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.962894 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.962876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:19.969774 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:19.969755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjsrs\" (UniqueName: \"kubernetes.io/projected/302d14c0-18e8-4940-948c-3b0fdf2bba1b-kube-api-access-wjsrs\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:20.462871 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:20.462828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:20.465085 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:20.465061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/302d14c0-18e8-4940-948c-3b0fdf2bba1b-node-exporter-tls\") pod \"node-exporter-56vp9\" (UID: \"302d14c0-18e8-4940-948c-3b0fdf2bba1b\") " pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:20.676905 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:20.676867 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-56vp9" Apr 16 04:26:20.684399 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:20.684368 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302d14c0_18e8_4940_948c_3b0fdf2bba1b.slice/crio-d88ee8ba45861de4d5b81e6786b742ff18984eba1a4b21162a1864639b4a9279 WatchSource:0}: Error finding container d88ee8ba45861de4d5b81e6786b742ff18984eba1a4b21162a1864639b4a9279: Status 404 returned error can't find the container with id d88ee8ba45861de4d5b81e6786b742ff18984eba1a4b21162a1864639b4a9279 Apr 16 04:26:21.295583 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:21.295547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-56vp9" event={"ID":"302d14c0-18e8-4940-948c-3b0fdf2bba1b","Type":"ContainerStarted","Data":"d88ee8ba45861de4d5b81e6786b742ff18984eba1a4b21162a1864639b4a9279"} Apr 16 04:26:22.298885 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:22.298843 2578 generic.go:358] "Generic (PLEG): container finished" podID="302d14c0-18e8-4940-948c-3b0fdf2bba1b" containerID="01101462555ebdcc5400bd3f4fd11c38be33baf85e2371732bb7ef3cf0fe7fce" exitCode=0 Apr 16 04:26:22.299322 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:22.298932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-56vp9" event={"ID":"302d14c0-18e8-4940-948c-3b0fdf2bba1b","Type":"ContainerDied","Data":"01101462555ebdcc5400bd3f4fd11c38be33baf85e2371732bb7ef3cf0fe7fce"} Apr 16 04:26:23.249486 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:23.249459 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-746fd49c8c-95h62" Apr 16 04:26:23.302977 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:23.302943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-56vp9" event={"ID":"302d14c0-18e8-4940-948c-3b0fdf2bba1b","Type":"ContainerStarted","Data":"e6cb3fd75551ed7c2223d4ad628a87f6a2a5f8b95dda9fca99778d02d1735d1d"} Apr 16 04:26:23.302977 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:23.302983 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-56vp9" event={"ID":"302d14c0-18e8-4940-948c-3b0fdf2bba1b","Type":"ContainerStarted","Data":"1dd267a35f92bc1e1a877971a45deba0d221673ae831e68f7efa42ef85e1737a"} Apr 16 04:26:23.321467 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:23.321417 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-56vp9" podStartSLOduration=3.595961714 podStartE2EDuration="4.321403547s" podCreationTimestamp="2026-04-16 04:26:19 +0000 UTC" firstStartedPulling="2026-04-16 04:26:20.686033712 +0000 UTC m=+136.502573769" lastFinishedPulling="2026-04-16 04:26:21.41147554 +0000 UTC m=+137.228015602" observedRunningTime="2026-04-16 04:26:23.320356234 +0000 UTC m=+139.136896312" watchObservedRunningTime="2026-04-16 04:26:23.321403547 +0000 UTC m=+139.137943664" Apr 16 04:26:24.053980 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.053945 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f76d64bc4-h4kxs"] Apr 16 04:26:24.056840 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.056825 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.059432 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.059409 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 04:26:24.059432 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.059411 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 04:26:24.060513 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.060487 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-73k1o6t64nd2r\"" Apr 16 04:26:24.060513 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.060506 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 04:26:24.060677 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.060550 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 04:26:24.060677 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.060567 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sxmx6\"" Apr 16 04:26:24.067471 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.067451 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f76d64bc4-h4kxs"] Apr 16 04:26:24.089198 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-metrics-server-audit-profiles\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: 
\"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089339 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-audit-log\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089339 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-tls\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089339 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pswh\" (UniqueName: \"kubernetes.io/projected/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-kube-api-access-5pswh\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089339 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089276 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-client-certs\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089480 ip-10-0-140-211 kubenswrapper[2578]: 
I0416 04:26:24.089358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.089480 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.089378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-client-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190521 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pswh\" (UniqueName: \"kubernetes.io/projected/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-kube-api-access-5pswh\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190521 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-client-certs\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190704 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190704 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-client-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190704 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-metrics-server-audit-profiles\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190815 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-audit-log\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.190880 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.190858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-tls\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 
04:26:24.191883 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.191853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.192004 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.191973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-metrics-server-audit-profiles\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.193969 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.193938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-client-certs\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.193969 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.193952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-client-ca-bundle\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.196438 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.196367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-audit-log\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.196999 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.196978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-secret-metrics-server-tls\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.199021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.198994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pswh\" (UniqueName: \"kubernetes.io/projected/f703fc51-0bf6-45d6-8fff-2bce418bb4c4-kube-api-access-5pswh\") pod \"metrics-server-7f76d64bc4-h4kxs\" (UID: \"f703fc51-0bf6-45d6-8fff-2bce418bb4c4\") " pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.365807 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.365729 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:24.485230 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:24.485197 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f76d64bc4-h4kxs"] Apr 16 04:26:24.488405 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:24.488380 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf703fc51_0bf6_45d6_8fff_2bce418bb4c4.slice/crio-37706e9e1ae9924829d5cb0383e327f9abc9bcdd3d5d0bdbb6b70214210be270 WatchSource:0}: Error finding container 37706e9e1ae9924829d5cb0383e327f9abc9bcdd3d5d0bdbb6b70214210be270: Status 404 returned error can't find the container with id 37706e9e1ae9924829d5cb0383e327f9abc9bcdd3d5d0bdbb6b70214210be270 Apr 16 04:26:25.309635 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:25.309592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" event={"ID":"f703fc51-0bf6-45d6-8fff-2bce418bb4c4","Type":"ContainerStarted","Data":"37706e9e1ae9924829d5cb0383e327f9abc9bcdd3d5d0bdbb6b70214210be270"} Apr 16 04:26:26.317680 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:26.317585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" event={"ID":"f703fc51-0bf6-45d6-8fff-2bce418bb4c4","Type":"ContainerStarted","Data":"ae57e469c57234328ba28cc04527f36b41a7d4262b126270064e3d2d9c03924c"} Apr 16 04:26:26.333539 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:26.333491 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" podStartSLOduration=0.843160277 podStartE2EDuration="2.333474774s" podCreationTimestamp="2026-04-16 04:26:24 +0000 UTC" firstStartedPulling="2026-04-16 04:26:24.490622172 +0000 UTC m=+140.307162233" lastFinishedPulling="2026-04-16 04:26:25.98093667 
+0000 UTC m=+141.797476730" observedRunningTime="2026-04-16 04:26:26.332744591 +0000 UTC m=+142.149284668" watchObservedRunningTime="2026-04-16 04:26:26.333474774 +0000 UTC m=+142.150014857" Apr 16 04:26:36.474909 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:36.474870 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" podUID="e382fd46-564e-4a07-898c-8f146dfae64b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 04:26:40.585617 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:26:40.585572 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zn869" podUID="4248c02e-a96d-4c4c-a829-8fa7ddd59809" Apr 16 04:26:40.600933 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:26:40.600901 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bntvm" podUID="9cf31f3f-502a-403b-a015-b0d26d2ac92f" Apr 16 04:26:41.355057 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:41.355030 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zn869" Apr 16 04:26:44.366843 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:44.366774 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:44.366843 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:44.366811 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:26:45.550350 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.550318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " pod="openshift-dns/dns-default-zn869" Apr 16 04:26:45.550813 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.550363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:26:45.552759 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.552733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cf31f3f-502a-403b-a015-b0d26d2ac92f-cert\") pod \"ingress-canary-bntvm\" (UID: \"9cf31f3f-502a-403b-a015-b0d26d2ac92f\") " pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:26:45.553229 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.553210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4248c02e-a96d-4c4c-a829-8fa7ddd59809-metrics-tls\") pod \"dns-default-zn869\" (UID: \"4248c02e-a96d-4c4c-a829-8fa7ddd59809\") " 
pod="openshift-dns/dns-default-zn869" Apr 16 04:26:45.557844 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.557825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\"" Apr 16 04:26:45.565876 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.565861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zn869" Apr 16 04:26:45.677924 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:45.677855 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zn869"] Apr 16 04:26:45.680255 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:45.680227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4248c02e_a96d_4c4c_a829_8fa7ddd59809.slice/crio-43be2c9a9f10058e61ed0d09e3ec3a8aa2889141fff02d5d6fa7382f9c2baf40 WatchSource:0}: Error finding container 43be2c9a9f10058e61ed0d09e3ec3a8aa2889141fff02d5d6fa7382f9c2baf40: Status 404 returned error can't find the container with id 43be2c9a9f10058e61ed0d09e3ec3a8aa2889141fff02d5d6fa7382f9c2baf40 Apr 16 04:26:46.367827 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:46.367788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zn869" event={"ID":"4248c02e-a96d-4c4c-a829-8fa7ddd59809","Type":"ContainerStarted","Data":"43be2c9a9f10058e61ed0d09e3ec3a8aa2889141fff02d5d6fa7382f9c2baf40"} Apr 16 04:26:46.475170 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:46.475128 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" podUID="e382fd46-564e-4a07-898c-8f146dfae64b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 04:26:47.372272 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:47.372237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/dns-default-zn869" event={"ID":"4248c02e-a96d-4c4c-a829-8fa7ddd59809","Type":"ContainerStarted","Data":"390a7b6a5891e8b55ff3841c43f3f3a9f1e7bcea3cef604121f53d4a966db2c3"} Apr 16 04:26:47.372272 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:47.372272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zn869" event={"ID":"4248c02e-a96d-4c4c-a829-8fa7ddd59809","Type":"ContainerStarted","Data":"2c9b17707b30a0fb5137af3d0263c2c75a8f287142b5a0424b72438d45c06f29"} Apr 16 04:26:47.372708 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:47.372312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zn869" Apr 16 04:26:47.390314 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:47.389992 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zn869" podStartSLOduration=129.203096398 podStartE2EDuration="2m10.389973875s" podCreationTimestamp="2026-04-16 04:24:37 +0000 UTC" firstStartedPulling="2026-04-16 04:26:45.681988429 +0000 UTC m=+161.498528487" lastFinishedPulling="2026-04-16 04:26:46.868865885 +0000 UTC m=+162.685405964" observedRunningTime="2026-04-16 04:26:47.389251474 +0000 UTC m=+163.205791575" watchObservedRunningTime="2026-04-16 04:26:47.389973875 +0000 UTC m=+163.206513959" Apr 16 04:26:54.788175 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:54.788148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:26:54.791221 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:54.791204 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\"" Apr 16 04:26:54.799324 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:54.799309 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bntvm" Apr 16 04:26:54.915755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:54.914271 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bntvm"] Apr 16 04:26:54.918538 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:26:54.918509 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf31f3f_502a_403b_a015_b0d26d2ac92f.slice/crio-40e59487df8febf6b6e5eb7ecc6acb2e0b75494e994f9075a413a666c02f29fe WatchSource:0}: Error finding container 40e59487df8febf6b6e5eb7ecc6acb2e0b75494e994f9075a413a666c02f29fe: Status 404 returned error can't find the container with id 40e59487df8febf6b6e5eb7ecc6acb2e0b75494e994f9075a413a666c02f29fe Apr 16 04:26:55.395134 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:55.395100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bntvm" event={"ID":"9cf31f3f-502a-403b-a015-b0d26d2ac92f","Type":"ContainerStarted","Data":"40e59487df8febf6b6e5eb7ecc6acb2e0b75494e994f9075a413a666c02f29fe"} Apr 16 04:26:56.474882 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:56.474839 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" podUID="e382fd46-564e-4a07-898c-8f146dfae64b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 04:26:56.475254 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:56.474909 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" Apr 16 04:26:56.475429 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:56.475397 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"511666ba36a0cefd5d3338af5b75c217a2bfbddd2084a46d2a9fdc627e04c1b1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 04:26:56.475476 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:56.475462 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" podUID="e382fd46-564e-4a07-898c-8f146dfae64b" containerName="service-proxy" containerID="cri-o://511666ba36a0cefd5d3338af5b75c217a2bfbddd2084a46d2a9fdc627e04c1b1" gracePeriod=30 Apr 16 04:26:57.376927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.376892 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zn869" Apr 16 04:26:57.402212 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.402180 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bntvm" event={"ID":"9cf31f3f-502a-403b-a015-b0d26d2ac92f","Type":"ContainerStarted","Data":"f660829917fd084ae435f81e0b69dadea39cdf4dd2928064a4ef3adfb4aab677"} Apr 16 04:26:57.404746 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.404719 2578 generic.go:358] "Generic (PLEG): container finished" podID="e382fd46-564e-4a07-898c-8f146dfae64b" containerID="511666ba36a0cefd5d3338af5b75c217a2bfbddd2084a46d2a9fdc627e04c1b1" exitCode=2 Apr 16 04:26:57.404883 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.404756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerDied","Data":"511666ba36a0cefd5d3338af5b75c217a2bfbddd2084a46d2a9fdc627e04c1b1"} Apr 16 04:26:57.404883 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.404780 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f8bc7bf7c-792r6" event={"ID":"e382fd46-564e-4a07-898c-8f146dfae64b","Type":"ContainerStarted","Data":"93df6612bdcc0059cafa89dd6f16929729978e07895cc558ecc6b67db70591ab"} Apr 16 04:26:57.418118 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:26:57.418049 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bntvm" podStartSLOduration=138.859599791 podStartE2EDuration="2m20.418035634s" podCreationTimestamp="2026-04-16 04:24:37 +0000 UTC" firstStartedPulling="2026-04-16 04:26:54.920179033 +0000 UTC m=+170.736719090" lastFinishedPulling="2026-04-16 04:26:56.478614873 +0000 UTC m=+172.295154933" observedRunningTime="2026-04-16 04:26:57.416857852 +0000 UTC m=+173.233397932" watchObservedRunningTime="2026-04-16 04:26:57.418035634 +0000 UTC m=+173.234575744" Apr 16 04:27:04.371290 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:27:04.371254 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:27:04.374993 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:27:04.374969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f76d64bc4-h4kxs" Apr 16 04:28:31.814507 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.814428 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr"] Apr 16 04:28:31.817566 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.817550 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.820059 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.820038 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 04:28:31.821212 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.821192 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\"" Apr 16 04:28:31.821341 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.821237 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 04:28:31.826237 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.826212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr"] Apr 16 04:28:31.896104 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.896077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf8s4\" (UniqueName: \"kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.896237 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.896124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.896237 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.896220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.996723 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.996695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.996851 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.996737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf8s4\" (UniqueName: \"kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.996851 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.996763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.997063 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.997043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:31.997103 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:31.997064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:32.005399 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:32.005377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf8s4\" (UniqueName: \"kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:32.126693 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:32.126608 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:32.239777 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:32.239744 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr"] Apr 16 04:28:32.243699 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:28:32.243671 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0963b1_5128_4dd5_9e51_4df58657234b.slice/crio-d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114 WatchSource:0}: Error finding container d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114: Status 404 returned error can't find the container with id d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114 Apr 16 04:28:32.642581 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:32.642530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" event={"ID":"bf0963b1-5128-4dd5-9e51-4df58657234b","Type":"ContainerStarted","Data":"d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114"} Apr 16 04:28:38.658875 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:38.658847 2578 generic.go:358] "Generic (PLEG): container finished" podID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerID="1665e86bea215b9990791b7979783ed4d974849599dd4770f6191a40bfcbae6c" exitCode=0 Apr 16 04:28:38.659244 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:38.658949 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" event={"ID":"bf0963b1-5128-4dd5-9e51-4df58657234b","Type":"ContainerDied","Data":"1665e86bea215b9990791b7979783ed4d974849599dd4770f6191a40bfcbae6c"} Apr 16 04:28:40.665360 ip-10-0-140-211 kubenswrapper[2578]: 
I0416 04:28:40.665327 2578 generic.go:358] "Generic (PLEG): container finished" podID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerID="d81d0c9b793c605fec7e15aa38423103763cadacd01eabdebb21f03d54590a3a" exitCode=0 Apr 16 04:28:40.665722 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:40.665383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" event={"ID":"bf0963b1-5128-4dd5-9e51-4df58657234b","Type":"ContainerDied","Data":"d81d0c9b793c605fec7e15aa38423103763cadacd01eabdebb21f03d54590a3a"} Apr 16 04:28:46.686373 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:46.686281 2578 generic.go:358] "Generic (PLEG): container finished" podID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerID="32dd9d92df093a81da0ca7d9853aa93dae4a3cb094190ae248e0cfacf75a89f9" exitCode=0 Apr 16 04:28:46.686373 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:46.686329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" event={"ID":"bf0963b1-5128-4dd5-9e51-4df58657234b","Type":"ContainerDied","Data":"32dd9d92df093a81da0ca7d9853aa93dae4a3cb094190ae248e0cfacf75a89f9"} Apr 16 04:28:47.802655 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.802629 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:28:47.922916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.922869 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle\") pod \"bf0963b1-5128-4dd5-9e51-4df58657234b\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " Apr 16 04:28:47.922916 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.922918 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util\") pod \"bf0963b1-5128-4dd5-9e51-4df58657234b\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " Apr 16 04:28:47.923144 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.922949 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf8s4\" (UniqueName: \"kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4\") pod \"bf0963b1-5128-4dd5-9e51-4df58657234b\" (UID: \"bf0963b1-5128-4dd5-9e51-4df58657234b\") " Apr 16 04:28:47.923485 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.923451 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle" (OuterVolumeSpecName: "bundle") pod "bf0963b1-5128-4dd5-9e51-4df58657234b" (UID: "bf0963b1-5128-4dd5-9e51-4df58657234b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:28:47.925129 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.925104 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4" (OuterVolumeSpecName: "kube-api-access-tf8s4") pod "bf0963b1-5128-4dd5-9e51-4df58657234b" (UID: "bf0963b1-5128-4dd5-9e51-4df58657234b"). InnerVolumeSpecName "kube-api-access-tf8s4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:28:47.926879 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:47.926856 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util" (OuterVolumeSpecName: "util") pod "bf0963b1-5128-4dd5-9e51-4df58657234b" (UID: "bf0963b1-5128-4dd5-9e51-4df58657234b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:28:48.023683 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.023641 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-bundle\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:28:48.023683 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.023680 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0963b1-5128-4dd5-9e51-4df58657234b-util\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:28:48.023683 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.023690 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tf8s4\" (UniqueName: \"kubernetes.io/projected/bf0963b1-5128-4dd5-9e51-4df58657234b-kube-api-access-tf8s4\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:28:48.692718 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.692685 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" event={"ID":"bf0963b1-5128-4dd5-9e51-4df58657234b","Type":"ContainerDied","Data":"d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114"} Apr 16 04:28:48.692718 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.692721 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d06a1fa4397f02feb138cd496aec60daee09a7fca7430be603bd37d61be57114" Apr 16 04:28:48.692914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:28:48.692731 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lh4vr" Apr 16 04:29:04.684657 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:04.684626 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:29:04.685190 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:04.684850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:29:04.690465 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:04.690440 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 04:29:24.989512 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989479 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm"] Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989731 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="util" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989743 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="util" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989754 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="pull" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989760 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="pull" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989772 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="extract" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989778 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="extract" Apr 16 04:29:24.991619 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.989824 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf0963b1-5128-4dd5-9e51-4df58657234b" containerName="extract" Apr 16 04:29:24.992437 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.992419 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:24.994923 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.994893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\"" Apr 16 04:29:24.995755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.995741 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 04:29:24.995944 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.995929 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 04:29:24.998759 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:24.998699 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm"] Apr 16 04:29:25.089763 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.089726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.089918 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.089781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.089918 
ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.089814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntd2m\" (UniqueName: \"kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.191129 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.191088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.191129 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.191131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.191384 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.191153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntd2m\" (UniqueName: \"kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.191534 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:29:25.191513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.191569 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.191541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.199014 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.198991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntd2m\" (UniqueName: \"kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.301217 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.301115 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:25.416846 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.416822 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm"] Apr 16 04:29:25.419186 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:25.419160 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11d76de_0b65_4e5b_aa9d_3cc52eb67a3d.slice/crio-96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f WatchSource:0}: Error finding container 96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f: Status 404 returned error can't find the container with id 96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f Apr 16 04:29:25.421060 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.421043 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:29:25.785406 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.785375 2578 generic.go:358] "Generic (PLEG): container finished" podID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerID="b329d7943378c31dcc20aa0b6e3ae1b26ff787b66ddfef50c35fbe701fb4cb23" exitCode=0 Apr 16 04:29:25.785557 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.785462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" event={"ID":"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d","Type":"ContainerDied","Data":"b329d7943378c31dcc20aa0b6e3ae1b26ff787b66ddfef50c35fbe701fb4cb23"} Apr 16 04:29:25.785557 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:25.785496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" 
event={"ID":"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d","Type":"ContainerStarted","Data":"96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f"} Apr 16 04:29:26.793653 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:26.793624 2578 generic.go:358] "Generic (PLEG): container finished" podID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerID="dd901363ad272dce647c1c8e9f52d784da4f43071ad7cc72d1f6f9a669085a03" exitCode=0 Apr 16 04:29:26.794040 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:26.793914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" event={"ID":"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d","Type":"ContainerDied","Data":"dd901363ad272dce647c1c8e9f52d784da4f43071ad7cc72d1f6f9a669085a03"} Apr 16 04:29:27.797661 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:27.797629 2578 generic.go:358] "Generic (PLEG): container finished" podID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerID="0d510d3ee8f2de6cc389a7019bcbde3a0c0f3c3246f5353f2683deca9ff73466" exitCode=0 Apr 16 04:29:27.798052 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:27.797702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" event={"ID":"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d","Type":"ContainerDied","Data":"0d510d3ee8f2de6cc389a7019bcbde3a0c0f3c3246f5353f2683deca9ff73466"} Apr 16 04:29:28.914829 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:28.914807 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:29.022045 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.022006 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle\") pod \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " Apr 16 04:29:29.022045 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.022049 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntd2m\" (UniqueName: \"kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m\") pod \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " Apr 16 04:29:29.022327 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.022113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util\") pod \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\" (UID: \"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d\") " Apr 16 04:29:29.022779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.022751 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle" (OuterVolumeSpecName: "bundle") pod "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" (UID: "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:29.024257 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.024234 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m" (OuterVolumeSpecName: "kube-api-access-ntd2m") pod "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" (UID: "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d"). InnerVolumeSpecName "kube-api-access-ntd2m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:29:29.027655 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.027631 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util" (OuterVolumeSpecName: "util") pod "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" (UID: "f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:29.122779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.122686 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-util\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:29.122779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.122752 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-bundle\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:29.122779 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.122764 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntd2m\" (UniqueName: \"kubernetes.io/projected/f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d-kube-api-access-ntd2m\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:29.804365 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.804335 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" Apr 16 04:29:29.804536 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.804335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5lfshm" event={"ID":"f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d","Type":"ContainerDied","Data":"96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f"} Apr 16 04:29:29.804536 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:29.804444 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96539ad9efc643e0eabc35d3fe35a00043bfc045bb4abce5d2dfe3a2e3e13a6f" Apr 16 04:29:38.811733 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811702 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z"] Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811908 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="util" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811919 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="util" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811933 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="pull" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811938 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="pull" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811945 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="extract" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811951 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="extract" Apr 16 04:29:38.812194 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.811990 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f11d76de-0b65-4e5b-aa9d-3cc52eb67a3d" containerName="extract" Apr 16 04:29:38.816114 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.816097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:38.818693 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.818671 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 04:29:38.818830 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.818672 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 04:29:38.818830 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.818677 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\"" Apr 16 04:29:38.823051 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.822625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z"] Apr 16 04:29:38.998093 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.998058 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgdb\" (UniqueName: \"kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: 
\"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:38.998256 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.998139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:38.998256 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:38.998156 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.099140 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.099033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.099140 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.099089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.099140 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.099121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgdb\" (UniqueName: \"kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.099462 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.099439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.099462 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.099457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.107379 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.107357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgdb\" (UniqueName: \"kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.125098 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.125077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:39.239536 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.239508 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z"] Apr 16 04:29:39.242634 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:39.242605 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140bcda8_07de_4ffc_864e_bb4f6282940b.slice/crio-2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4 WatchSource:0}: Error finding container 2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4: Status 404 returned error can't find the container with id 2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4 Apr 16 04:29:39.537039 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.537004 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn"] Apr 16 04:29:39.540399 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.540378 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.543721 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.543703 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 04:29:39.543860 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.543703 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 04:29:39.543860 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.543735 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 04:29:39.543989 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.543979 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 04:29:39.544107 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.544091 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zzbzl\"" Apr 16 04:29:39.559164 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.559140 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn"] Apr 16 04:29:39.704831 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.704785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.704831 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.704841 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zhr2\" (UniqueName: \"kubernetes.io/projected/5b8d485b-b841-465a-b467-cd0eeb14836d-kube-api-access-8zhr2\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.705072 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.704867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.805780 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.805697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.805780 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.805742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zhr2\" (UniqueName: \"kubernetes.io/projected/5b8d485b-b841-465a-b467-cd0eeb14836d-kube-api-access-8zhr2\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.805780 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.805761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.808100 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.808079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.808251 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.808228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b8d485b-b841-465a-b467-cd0eeb14836d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.813907 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.813881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zhr2\" (UniqueName: \"kubernetes.io/projected/5b8d485b-b841-465a-b467-cd0eeb14836d-kube-api-access-8zhr2\") pod \"opendatahub-operator-controller-manager-c7946b447-2szfn\" (UID: \"5b8d485b-b841-465a-b467-cd0eeb14836d\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.830786 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.830762 2578 generic.go:358] "Generic (PLEG): container finished" podID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerID="ec1b718491dd0235c3a5600e6e1f180c681f40cdcae5a608348485ef5dc85368" exitCode=0 Apr 16 04:29:39.830884 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:29:39.830841 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" event={"ID":"140bcda8-07de-4ffc-864e-bb4f6282940b","Type":"ContainerDied","Data":"ec1b718491dd0235c3a5600e6e1f180c681f40cdcae5a608348485ef5dc85368"} Apr 16 04:29:39.830884 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.830871 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" event={"ID":"140bcda8-07de-4ffc-864e-bb4f6282940b","Type":"ContainerStarted","Data":"2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4"} Apr 16 04:29:39.850335 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.850312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:39.966528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:39.966506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn"] Apr 16 04:29:39.968852 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:39.968825 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b8d485b_b841_465a_b467_cd0eeb14836d.slice/crio-424a0328c52ff1c97def55992416273f90196de21e7c1f4bc2344c11e7dbdad7 WatchSource:0}: Error finding container 424a0328c52ff1c97def55992416273f90196de21e7c1f4bc2344c11e7dbdad7: Status 404 returned error can't find the container with id 424a0328c52ff1c97def55992416273f90196de21e7c1f4bc2344c11e7dbdad7 Apr 16 04:29:40.835138 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:40.835101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" 
event={"ID":"5b8d485b-b841-465a-b467-cd0eeb14836d","Type":"ContainerStarted","Data":"424a0328c52ff1c97def55992416273f90196de21e7c1f4bc2344c11e7dbdad7"} Apr 16 04:29:40.837079 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:40.837050 2578 generic.go:358] "Generic (PLEG): container finished" podID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerID="645a4718493da29c8c413e681ef416c18b6398a1c8e02e3add825b08a25a9f39" exitCode=0 Apr 16 04:29:40.837225 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:40.837094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" event={"ID":"140bcda8-07de-4ffc-864e-bb4f6282940b","Type":"ContainerDied","Data":"645a4718493da29c8c413e681ef416c18b6398a1c8e02e3add825b08a25a9f39"} Apr 16 04:29:41.841768 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.841736 2578 generic.go:358] "Generic (PLEG): container finished" podID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerID="1ffe529ba734fcf9a4453d4f4e18bab0baf894dd3ab1e8504044e42b394b0139" exitCode=0 Apr 16 04:29:41.842212 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.841820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" event={"ID":"140bcda8-07de-4ffc-864e-bb4f6282940b","Type":"ContainerDied","Data":"1ffe529ba734fcf9a4453d4f4e18bab0baf894dd3ab1e8504044e42b394b0139"} Apr 16 04:29:41.846447 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.846424 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc"] Apr 16 04:29:41.855037 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.854998 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:41.857736 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.857710 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 04:29:41.857844 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.857794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc"] Apr 16 04:29:41.857927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.857913 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-2xlxd\"" Apr 16 04:29:41.857996 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.857980 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 04:29:41.858130 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.858106 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 04:29:41.858992 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.858788 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:29:41.858992 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.858834 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 04:29:41.918677 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.918644 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5cc40a23-e80f-48be-9feb-bfc24b833509-manager-config\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " 
pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:41.918832 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.918715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-metrics-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:41.918832 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.918735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdvd\" (UniqueName: \"kubernetes.io/projected/5cc40a23-e80f-48be-9feb-bfc24b833509-kube-api-access-bpdvd\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:41.918832 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:41.918766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.020031 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.019994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-metrics-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.020198 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.020040 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bpdvd\" (UniqueName: \"kubernetes.io/projected/5cc40a23-e80f-48be-9feb-bfc24b833509-kube-api-access-bpdvd\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.020198 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.020151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.020350 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.020223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5cc40a23-e80f-48be-9feb-bfc24b833509-manager-config\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.020918 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.020887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5cc40a23-e80f-48be-9feb-bfc24b833509-manager-config\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.022671 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.022645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-metrics-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") 
" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.022815 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.022795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc40a23-e80f-48be-9feb-bfc24b833509-cert\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.029846 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.029808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdvd\" (UniqueName: \"kubernetes.io/projected/5cc40a23-e80f-48be-9feb-bfc24b833509-kube-api-access-bpdvd\") pod \"lws-controller-manager-5988777b7d-ktcrc\" (UID: \"5cc40a23-e80f-48be-9feb-bfc24b833509\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.167762 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.167669 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:42.697824 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.697783 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc"] Apr 16 04:29:42.699160 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:42.699138 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc40a23_e80f_48be_9feb_bfc24b833509.slice/crio-08a6eed8f691223b610e1717341fd0c6a507dd5f57237711cd71eb994f5af4da WatchSource:0}: Error finding container 08a6eed8f691223b610e1717341fd0c6a507dd5f57237711cd71eb994f5af4da: Status 404 returned error can't find the container with id 08a6eed8f691223b610e1717341fd0c6a507dd5f57237711cd71eb994f5af4da Apr 16 04:29:42.846846 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.846751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" event={"ID":"5cc40a23-e80f-48be-9feb-bfc24b833509","Type":"ContainerStarted","Data":"08a6eed8f691223b610e1717341fd0c6a507dd5f57237711cd71eb994f5af4da"} Apr 16 04:29:42.848495 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.848463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" event={"ID":"5b8d485b-b841-465a-b467-cd0eeb14836d","Type":"ContainerStarted","Data":"3cc456be13e7421b7acf0d981b4d19736d3cbfbc4e8e8dd49936b2563110c166"} Apr 16 04:29:42.848628 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.848570 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:42.869408 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.869363 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" podStartSLOduration=1.2308724070000001 podStartE2EDuration="3.869347632s" podCreationTimestamp="2026-04-16 04:29:39 +0000 UTC" firstStartedPulling="2026-04-16 04:29:39.970647451 +0000 UTC m=+335.787187509" lastFinishedPulling="2026-04-16 04:29:42.609122662 +0000 UTC m=+338.425662734" observedRunningTime="2026-04-16 04:29:42.867716352 +0000 UTC m=+338.684256443" watchObservedRunningTime="2026-04-16 04:29:42.869347632 +0000 UTC m=+338.685887711" Apr 16 04:29:42.960009 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:42.959987 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:43.030126 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.030098 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlgdb\" (UniqueName: \"kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb\") pod \"140bcda8-07de-4ffc-864e-bb4f6282940b\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " Apr 16 04:29:43.030287 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.030204 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle\") pod \"140bcda8-07de-4ffc-864e-bb4f6282940b\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " Apr 16 04:29:43.030287 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.030239 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util\") pod \"140bcda8-07de-4ffc-864e-bb4f6282940b\" (UID: \"140bcda8-07de-4ffc-864e-bb4f6282940b\") " Apr 16 04:29:43.031223 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.031194 2578 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle" (OuterVolumeSpecName: "bundle") pod "140bcda8-07de-4ffc-864e-bb4f6282940b" (UID: "140bcda8-07de-4ffc-864e-bb4f6282940b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:43.032634 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.032598 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb" (OuterVolumeSpecName: "kube-api-access-tlgdb") pod "140bcda8-07de-4ffc-864e-bb4f6282940b" (UID: "140bcda8-07de-4ffc-864e-bb4f6282940b"). InnerVolumeSpecName "kube-api-access-tlgdb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:29:43.035873 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.035758 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util" (OuterVolumeSpecName: "util") pod "140bcda8-07de-4ffc-864e-bb4f6282940b" (UID: "140bcda8-07de-4ffc-864e-bb4f6282940b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:43.131315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.131165 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-bundle\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:43.131315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.131197 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/140bcda8-07de-4ffc-864e-bb4f6282940b-util\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:43.131315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.131207 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tlgdb\" (UniqueName: \"kubernetes.io/projected/140bcda8-07de-4ffc-864e-bb4f6282940b-kube-api-access-tlgdb\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:29:43.854365 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.854332 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" Apr 16 04:29:43.854839 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.854327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w7z8z" event={"ID":"140bcda8-07de-4ffc-864e-bb4f6282940b","Type":"ContainerDied","Data":"2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4"} Apr 16 04:29:43.854839 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:43.854477 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a24dcb500334bb62e5054a2d5fa9e3143bc7749e52a08248f0c77e1d288a7c4" Apr 16 04:29:45.862361 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:45.862327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" event={"ID":"5cc40a23-e80f-48be-9feb-bfc24b833509","Type":"ContainerStarted","Data":"b6b26f41b6530e4aaad12792035b577487b4039248920e2bc8303bee29b027f0"} Apr 16 04:29:45.862702 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:45.862452 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:45.880130 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:45.880086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" podStartSLOduration=2.423560518 podStartE2EDuration="4.880073361s" podCreationTimestamp="2026-04-16 04:29:41 +0000 UTC" firstStartedPulling="2026-04-16 04:29:42.700883423 +0000 UTC m=+338.517423488" lastFinishedPulling="2026-04-16 04:29:45.157396274 +0000 UTC m=+340.973936331" observedRunningTime="2026-04-16 04:29:45.879050555 +0000 UTC m=+341.695590649" watchObservedRunningTime="2026-04-16 04:29:45.880073361 +0000 UTC m=+341.696613492" Apr 16 
04:29:53.856389 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:53.856358 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-2szfn" Apr 16 04:29:56.357119 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357073 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw"] Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357336 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="extract" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357352 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="extract" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357369 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="pull" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357375 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="pull" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357385 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="util" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357391 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="util" Apr 16 04:29:56.357545 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.357434 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="140bcda8-07de-4ffc-864e-bb4f6282940b" containerName="extract" Apr 16 04:29:56.361630 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:29:56.361610 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.364315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.364275 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 04:29:56.364415 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.364277 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\"" Apr 16 04:29:56.365226 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.365205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 04:29:56.370755 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.370733 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw"] Apr 16 04:29:56.428698 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.428664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.428873 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.428779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.428922 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.428868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gtsz\" (UniqueName: \"kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.530285 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.530251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gtsz\" (UniqueName: \"kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.530497 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.530307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.530497 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.530350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.530767 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.530745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.530819 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.530761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.538045 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.538015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gtsz\" (UniqueName: \"kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.671502 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.671404 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:29:56.791074 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.790998 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw"] Apr 16 04:29:56.793226 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:56.793197 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25420e37_0118_46a2_a489_2d8ec8d60216.slice/crio-6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4 WatchSource:0}: Error finding container 6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4: Status 404 returned error can't find the container with id 6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4 Apr 16 04:29:56.867278 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.867256 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-ktcrc" Apr 16 04:29:56.894686 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.894640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerStarted","Data":"0a937b9a255e1d8a02231b8fe9e838226e57e4b5282d67a8bdd79083d5b8bd1f"} Apr 16 04:29:56.894926 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:56.894697 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerStarted","Data":"6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4"} Apr 16 04:29:57.108249 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.108214 2578 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd"] Apr 16 04:29:57.111237 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.111216 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.113515 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.113496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 04:29:57.113606 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.113537 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 04:29:57.113836 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.113809 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 04:29:57.113980 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.113960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-5rvrm\"" Apr 16 04:29:57.114459 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.114423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 04:29:57.121943 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.121920 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd"] Apr 16 04:29:57.235994 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.235954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a36c88de-8c4f-430f-8db6-fb29785a0264-tmp\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.236156 
ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.236028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmr4d\" (UniqueName: \"kubernetes.io/projected/a36c88de-8c4f-430f-8db6-fb29785a0264-kube-api-access-gmr4d\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.236156 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.236063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a36c88de-8c4f-430f-8db6-fb29785a0264-tls-certs\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.337046 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.337004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a36c88de-8c4f-430f-8db6-fb29785a0264-tmp\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.337228 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.337086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmr4d\" (UniqueName: \"kubernetes.io/projected/a36c88de-8c4f-430f-8db6-fb29785a0264-kube-api-access-gmr4d\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.337228 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.337118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a36c88de-8c4f-430f-8db6-fb29785a0264-tls-certs\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" 
(UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.339456 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.339421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a36c88de-8c4f-430f-8db6-fb29785a0264-tmp\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.339677 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.339656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a36c88de-8c4f-430f-8db6-fb29785a0264-tls-certs\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.344884 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.344858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmr4d\" (UniqueName: \"kubernetes.io/projected/a36c88de-8c4f-430f-8db6-fb29785a0264-kube-api-access-gmr4d\") pod \"kube-auth-proxy-68d6cc647-pv6kd\" (UID: \"a36c88de-8c4f-430f-8db6-fb29785a0264\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.426814 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.426727 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" Apr 16 04:29:57.545469 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.545443 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd"] Apr 16 04:29:57.547919 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:29:57.547888 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36c88de_8c4f_430f_8db6_fb29785a0264.slice/crio-e9387066ce5ff81f546ab597c179e49e016d4ace93900e5f6fc9c99e1c1b68d8 WatchSource:0}: Error finding container e9387066ce5ff81f546ab597c179e49e016d4ace93900e5f6fc9c99e1c1b68d8: Status 404 returned error can't find the container with id e9387066ce5ff81f546ab597c179e49e016d4ace93900e5f6fc9c99e1c1b68d8 Apr 16 04:29:57.899317 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.899264 2578 generic.go:358] "Generic (PLEG): container finished" podID="25420e37-0118-46a2-a489-2d8ec8d60216" containerID="0a937b9a255e1d8a02231b8fe9e838226e57e4b5282d67a8bdd79083d5b8bd1f" exitCode=0 Apr 16 04:29:57.899507 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.899353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerDied","Data":"0a937b9a255e1d8a02231b8fe9e838226e57e4b5282d67a8bdd79083d5b8bd1f"} Apr 16 04:29:57.900467 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:57.900448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" event={"ID":"a36c88de-8c4f-430f-8db6-fb29785a0264","Type":"ContainerStarted","Data":"e9387066ce5ff81f546ab597c179e49e016d4ace93900e5f6fc9c99e1c1b68d8"} Apr 16 04:29:59.908785 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:59.908752 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="25420e37-0118-46a2-a489-2d8ec8d60216" containerID="a686ae1baf1a9ade0928ed180ad99fe8a032799aa8d9f4e2714b57b7f4c85206" exitCode=0 Apr 16 04:29:59.909134 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:29:59.908798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerDied","Data":"a686ae1baf1a9ade0928ed180ad99fe8a032799aa8d9f4e2714b57b7f4c85206"} Apr 16 04:30:00.913433 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:00.913402 2578 generic.go:358] "Generic (PLEG): container finished" podID="25420e37-0118-46a2-a489-2d8ec8d60216" containerID="63c714976113d5eb74c6190a4acc576862212b30c3e9fd5667e24ee6b132ac46" exitCode=0 Apr 16 04:30:00.913891 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:00.913488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerDied","Data":"63c714976113d5eb74c6190a4acc576862212b30c3e9fd5667e24ee6b132ac46"} Apr 16 04:30:01.917669 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:01.917632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" event={"ID":"a36c88de-8c4f-430f-8db6-fb29785a0264","Type":"ContainerStarted","Data":"1431adb383b6283bbf026b53f868827dc9603e3bb30334281fba06f17a67cb4e"} Apr 16 04:30:01.934334 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:01.934235 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-68d6cc647-pv6kd" podStartSLOduration=1.5457766739999999 podStartE2EDuration="4.934214332s" podCreationTimestamp="2026-04-16 04:29:57 +0000 UTC" firstStartedPulling="2026-04-16 04:29:57.550012491 +0000 UTC m=+353.366552549" lastFinishedPulling="2026-04-16 04:30:00.938450131 +0000 UTC 
m=+356.754990207" observedRunningTime="2026-04-16 04:30:01.933627809 +0000 UTC m=+357.750167891" watchObservedRunningTime="2026-04-16 04:30:01.934214332 +0000 UTC m=+357.750754417" Apr 16 04:30:02.051089 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.051066 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:30:02.181493 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.181411 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util\") pod \"25420e37-0118-46a2-a489-2d8ec8d60216\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " Apr 16 04:30:02.181493 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.181479 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gtsz\" (UniqueName: \"kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz\") pod \"25420e37-0118-46a2-a489-2d8ec8d60216\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " Apr 16 04:30:02.181655 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.181512 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle\") pod \"25420e37-0118-46a2-a489-2d8ec8d60216\" (UID: \"25420e37-0118-46a2-a489-2d8ec8d60216\") " Apr 16 04:30:02.182395 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.182366 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle" (OuterVolumeSpecName: "bundle") pod "25420e37-0118-46a2-a489-2d8ec8d60216" (UID: "25420e37-0118-46a2-a489-2d8ec8d60216"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:30:02.183514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.183485 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz" (OuterVolumeSpecName: "kube-api-access-8gtsz") pod "25420e37-0118-46a2-a489-2d8ec8d60216" (UID: "25420e37-0118-46a2-a489-2d8ec8d60216"). InnerVolumeSpecName "kube-api-access-8gtsz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:30:02.185677 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.185636 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util" (OuterVolumeSpecName: "util") pod "25420e37-0118-46a2-a489-2d8ec8d60216" (UID: "25420e37-0118-46a2-a489-2d8ec8d60216"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:30:02.283033 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.282991 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8gtsz\" (UniqueName: \"kubernetes.io/projected/25420e37-0118-46a2-a489-2d8ec8d60216-kube-api-access-8gtsz\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:30:02.283033 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.283027 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-bundle\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:30:02.283033 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.283042 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25420e37-0118-46a2-a489-2d8ec8d60216-util\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:30:02.922801 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.922765 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" event={"ID":"25420e37-0118-46a2-a489-2d8ec8d60216","Type":"ContainerDied","Data":"6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4"} Apr 16 04:30:02.922801 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.922805 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6086e1e74e214ce7763c20d0da3ee59341b4104310b53193d6c5bf0214fc75f4" Apr 16 04:30:02.923212 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:30:02.922831 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mlhjw" Apr 16 04:31:49.863101 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863061 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc"] Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863417 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="pull" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863435 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="pull" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863458 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="util" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863467 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="util" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863483 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="extract" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863491 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="extract" Apr 16 04:31:49.863633 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.863553 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="25420e37-0118-46a2-a489-2d8ec8d60216" containerName="extract" Apr 16 04:31:49.866404 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.866384 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:49.869004 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.868983 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 04:31:49.869114 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.869000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 04:31:49.869114 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.869007 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 04:31:49.870236 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.870171 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5kqt9\"" Apr 16 04:31:49.870236 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.870191 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 04:31:49.872082 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.872062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc"] Apr 16 04:31:49.963196 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:31:49.963166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/091e67ca-cd62-42b9-93f6-30f05c12506b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:49.963196 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.963204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgq9k\" (UniqueName: \"kubernetes.io/projected/091e67ca-cd62-42b9-93f6-30f05c12506b-kube-api-access-kgq9k\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:49.963444 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:49.963241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.064632 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.064588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgq9k\" (UniqueName: \"kubernetes.io/projected/091e67ca-cd62-42b9-93f6-30f05c12506b-kube-api-access-kgq9k\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.064811 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.064650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.064811 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.064745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/091e67ca-cd62-42b9-93f6-30f05c12506b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.064922 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:31:50.064827 2578 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 04:31:50.064972 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:31:50.064920 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert podName:091e67ca-cd62-42b9-93f6-30f05c12506b nodeName:}" failed. No retries permitted until 2026-04-16 04:31:50.564895194 +0000 UTC m=+466.381435260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-wqvvc" (UID: "091e67ca-cd62-42b9-93f6-30f05c12506b") : secret "plugin-serving-cert" not found Apr 16 04:31:50.065419 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.065400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/091e67ca-cd62-42b9-93f6-30f05c12506b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.072828 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.072801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgq9k\" (UniqueName: \"kubernetes.io/projected/091e67ca-cd62-42b9-93f6-30f05c12506b-kube-api-access-kgq9k\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.568791 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.568757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.571178 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.571149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/091e67ca-cd62-42b9-93f6-30f05c12506b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wqvvc\" (UID: \"091e67ca-cd62-42b9-93f6-30f05c12506b\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.776816 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.776781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" Apr 16 04:31:50.903990 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:50.903957 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc"] Apr 16 04:31:50.906439 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:31:50.906412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod091e67ca_cd62_42b9_93f6_30f05c12506b.slice/crio-5a2f94f69158d6706a31bd7ebdaff8b91edc0de9c5a853d40bb3671bc77ac99b WatchSource:0}: Error finding container 5a2f94f69158d6706a31bd7ebdaff8b91edc0de9c5a853d40bb3671bc77ac99b: Status 404 returned error can't find the container with id 5a2f94f69158d6706a31bd7ebdaff8b91edc0de9c5a853d40bb3671bc77ac99b Apr 16 04:31:51.245099 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:31:51.245063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" event={"ID":"091e67ca-cd62-42b9-93f6-30f05c12506b","Type":"ContainerStarted","Data":"5a2f94f69158d6706a31bd7ebdaff8b91edc0de9c5a853d40bb3671bc77ac99b"} Apr 16 04:32:14.330381 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:14.330339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" event={"ID":"091e67ca-cd62-42b9-93f6-30f05c12506b","Type":"ContainerStarted","Data":"101722b0873eb16a75ef759eca6daf74b5d6d5abf2a1e2b4c80ca82562d8063a"} Apr 16 04:32:14.346239 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:14.346184 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wqvvc" podStartSLOduration=2.374250251 
podStartE2EDuration="25.346167696s" podCreationTimestamp="2026-04-16 04:31:49 +0000 UTC" firstStartedPulling="2026-04-16 04:31:50.907638319 +0000 UTC m=+466.724178377" lastFinishedPulling="2026-04-16 04:32:13.87955575 +0000 UTC m=+489.696095822" observedRunningTime="2026-04-16 04:32:14.34471849 +0000 UTC m=+490.161258569" watchObservedRunningTime="2026-04-16 04:32:14.346167696 +0000 UTC m=+490.162707802" Apr 16 04:32:36.596377 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.596343 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:32:36.819253 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.819212 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:36.821953 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.821932 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 04:32:36.823034 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.823013 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:32:36.823153 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.823042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:32:36.927224 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.927146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shnzg\" (UniqueName: \"kubernetes.io/projected/5581d3a4-fa67-441a-8cb0-31637d096760-kube-api-access-shnzg\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: \"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:36.927224 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:36.927183 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5581d3a4-fa67-441a-8cb0-31637d096760-config-file\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: \"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.028282 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.028244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5581d3a4-fa67-441a-8cb0-31637d096760-config-file\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: \"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.028456 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.028377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shnzg\" (UniqueName: \"kubernetes.io/projected/5581d3a4-fa67-441a-8cb0-31637d096760-kube-api-access-shnzg\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: \"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.028968 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.028951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5581d3a4-fa67-441a-8cb0-31637d096760-config-file\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: \"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.037798 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.037773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shnzg\" (UniqueName: \"kubernetes.io/projected/5581d3a4-fa67-441a-8cb0-31637d096760-kube-api-access-shnzg\") pod \"limitador-limitador-78c99df468-79dg5\" (UID: 
\"5581d3a4-fa67-441a-8cb0-31637d096760\") " pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.128791 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.128754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:37.187243 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.187169 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"] Apr 16 04:32:37.241744 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.241707 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"] Apr 16 04:32:37.241872 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.241772 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6bkht" Apr 16 04:32:37.244996 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.244960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-w8nb4\"" Apr 16 04:32:37.246592 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.246575 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:32:37.248798 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:32:37.248775 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5581d3a4_fa67_441a_8cb0_31637d096760.slice/crio-996a86b8e2f81ff0ab07678ddf306111aa653b5416fd494d9f7492829c26e65a WatchSource:0}: Error finding container 996a86b8e2f81ff0ab07678ddf306111aa653b5416fd494d9f7492829c26e65a: Status 404 returned error can't find the container with id 996a86b8e2f81ff0ab07678ddf306111aa653b5416fd494d9f7492829c26e65a Apr 16 04:32:37.330337 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.330284 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl54w\" (UniqueName: \"kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w\") pod \"authorino-7498df8756-6bkht\" (UID: \"3f0929c5-a77d-461c-8795-3237345fd6f0\") " pod="kuadrant-system/authorino-7498df8756-6bkht" Apr 16 04:32:37.396690 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.396659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" event={"ID":"5581d3a4-fa67-441a-8cb0-31637d096760","Type":"ContainerStarted","Data":"996a86b8e2f81ff0ab07678ddf306111aa653b5416fd494d9f7492829c26e65a"} Apr 16 04:32:37.431208 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.431183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl54w\" (UniqueName: \"kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w\") pod \"authorino-7498df8756-6bkht\" (UID: \"3f0929c5-a77d-461c-8795-3237345fd6f0\") " pod="kuadrant-system/authorino-7498df8756-6bkht" Apr 16 04:32:37.439383 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.439322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl54w\" (UniqueName: \"kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w\") pod \"authorino-7498df8756-6bkht\" (UID: \"3f0929c5-a77d-461c-8795-3237345fd6f0\") " pod="kuadrant-system/authorino-7498df8756-6bkht" Apr 16 04:32:37.558311 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.558271 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6bkht" Apr 16 04:32:37.672178 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:37.672150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"] Apr 16 04:32:37.674688 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:32:37.674663 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f0929c5_a77d_461c_8795_3237345fd6f0.slice/crio-528db26ef445e5842197196cc12397bc2acd2e1c8d546289cca380758c7305bd WatchSource:0}: Error finding container 528db26ef445e5842197196cc12397bc2acd2e1c8d546289cca380758c7305bd: Status 404 returned error can't find the container with id 528db26ef445e5842197196cc12397bc2acd2e1c8d546289cca380758c7305bd Apr 16 04:32:38.403761 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:38.403706 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6bkht" event={"ID":"3f0929c5-a77d-461c-8795-3237345fd6f0","Type":"ContainerStarted","Data":"528db26ef445e5842197196cc12397bc2acd2e1c8d546289cca380758c7305bd"} Apr 16 04:32:43.422653 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:43.422613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" event={"ID":"5581d3a4-fa67-441a-8cb0-31637d096760","Type":"ContainerStarted","Data":"04278058a7e22d935e88ab15826b12493385fc7c59cae91c6109913163f12333"} Apr 16 04:32:43.422653 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:43.422661 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:32:43.423824 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:43.423802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6bkht" 
event={"ID":"3f0929c5-a77d-461c-8795-3237345fd6f0","Type":"ContainerStarted","Data":"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"} Apr 16 04:32:43.438447 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:43.438395 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" podStartSLOduration=1.770149311 podStartE2EDuration="7.438379486s" podCreationTimestamp="2026-04-16 04:32:36 +0000 UTC" firstStartedPulling="2026-04-16 04:32:37.250454241 +0000 UTC m=+513.066994299" lastFinishedPulling="2026-04-16 04:32:42.918684414 +0000 UTC m=+518.735224474" observedRunningTime="2026-04-16 04:32:43.437735816 +0000 UTC m=+519.254275896" watchObservedRunningTime="2026-04-16 04:32:43.438379486 +0000 UTC m=+519.254919568" Apr 16 04:32:43.450720 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:43.450678 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-6bkht" podStartSLOduration=1.281561671 podStartE2EDuration="6.450645419s" podCreationTimestamp="2026-04-16 04:32:37 +0000 UTC" firstStartedPulling="2026-04-16 04:32:37.675955178 +0000 UTC m=+513.492495240" lastFinishedPulling="2026-04-16 04:32:42.84503893 +0000 UTC m=+518.661578988" observedRunningTime="2026-04-16 04:32:43.449476812 +0000 UTC m=+519.266016894" watchObservedRunningTime="2026-04-16 04:32:43.450645419 +0000 UTC m=+519.267185498" Apr 16 04:32:54.427675 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:32:54.427645 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-79dg5" Apr 16 04:34:04.712663 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:04.712628 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:34:04.713173 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:04.712832 
2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:34:04.726459 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:04.726434 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:34:05.007814 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.007782 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"] Apr 16 04:34:05.009798 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.009783 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-95d87" Apr 16 04:34:05.017201 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.017177 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"] Apr 16 04:34:05.081400 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.081367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6tzt\" (UniqueName: \"kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt\") pod \"authorino-8b475cf9f-95d87\" (UID: \"6a23f264-97fc-450b-ae16-2a7310176757\") " pod="kuadrant-system/authorino-8b475cf9f-95d87" Apr 16 04:34:05.182487 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.182459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6tzt\" (UniqueName: \"kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt\") pod \"authorino-8b475cf9f-95d87\" (UID: \"6a23f264-97fc-450b-ae16-2a7310176757\") " pod="kuadrant-system/authorino-8b475cf9f-95d87" Apr 16 04:34:05.205260 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.205228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6tzt\" (UniqueName: 
\"kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt\") pod \"authorino-8b475cf9f-95d87\" (UID: \"6a23f264-97fc-450b-ae16-2a7310176757\") " pod="kuadrant-system/authorino-8b475cf9f-95d87" Apr 16 04:34:05.255530 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.255495 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"] Apr 16 04:34:05.255732 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.255720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-95d87" Apr 16 04:34:05.280306 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.280266 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"] Apr 16 04:34:05.283137 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.283105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-659756f867-bvjfd" Apr 16 04:34:05.290041 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.290015 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"] Apr 16 04:34:05.374991 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.374964 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"] Apr 16 04:34:05.376970 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:34:05.376941 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a23f264_97fc_450b_ae16_2a7310176757.slice/crio-104cf3dc99ab66f08f75f5e427632ad2f7bea46116fc6903d224e008617e7b42 WatchSource:0}: Error finding container 104cf3dc99ab66f08f75f5e427632ad2f7bea46116fc6903d224e008617e7b42: Status 404 returned error can't find the container with id 104cf3dc99ab66f08f75f5e427632ad2f7bea46116fc6903d224e008617e7b42 Apr 16 04:34:05.384332 ip-10-0-140-211 kubenswrapper[2578]: 
I0416 04:34:05.384290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcwsr\" (UniqueName: \"kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr\") pod \"authorino-659756f867-bvjfd\" (UID: \"af4aac87-185c-49c3-960d-6ec1e41c3d48\") " pod="kuadrant-system/authorino-659756f867-bvjfd" Apr 16 04:34:05.485197 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.485170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcwsr\" (UniqueName: \"kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr\") pod \"authorino-659756f867-bvjfd\" (UID: \"af4aac87-185c-49c3-960d-6ec1e41c3d48\") " pod="kuadrant-system/authorino-659756f867-bvjfd" Apr 16 04:34:05.492713 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.492685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcwsr\" (UniqueName: \"kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr\") pod \"authorino-659756f867-bvjfd\" (UID: \"af4aac87-185c-49c3-960d-6ec1e41c3d48\") " pod="kuadrant-system/authorino-659756f867-bvjfd" Apr 16 04:34:05.593175 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.593101 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"] Apr 16 04:34:05.593317 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.593306 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-659756f867-bvjfd"
Apr 16 04:34:05.675649 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.675605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-95d87" event={"ID":"6a23f264-97fc-450b-ae16-2a7310176757","Type":"ContainerStarted","Data":"104cf3dc99ab66f08f75f5e427632ad2f7bea46116fc6903d224e008617e7b42"}
Apr 16 04:34:05.706684 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:05.706653 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"]
Apr 16 04:34:05.709581 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:34:05.709558 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf4aac87_185c_49c3_960d_6ec1e41c3d48.slice/crio-c471ca533c6b13f49886bfd9f855c2734a4157095fd0588671300908c54c9a40 WatchSource:0}: Error finding container c471ca533c6b13f49886bfd9f855c2734a4157095fd0588671300908c54c9a40: Status 404 returned error can't find the container with id c471ca533c6b13f49886bfd9f855c2734a4157095fd0588671300908c54c9a40
Apr 16 04:34:06.681085 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.681048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-95d87" event={"ID":"6a23f264-97fc-450b-ae16-2a7310176757","Type":"ContainerStarted","Data":"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"}
Apr 16 04:34:06.681580 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.681139 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-95d87" podUID="6a23f264-97fc-450b-ae16-2a7310176757" containerName="authorino" containerID="cri-o://2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef" gracePeriod=30
Apr 16 04:34:06.682436 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.682410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-659756f867-bvjfd" event={"ID":"af4aac87-185c-49c3-960d-6ec1e41c3d48","Type":"ContainerStarted","Data":"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"}
Apr 16 04:34:06.682543 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.682441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-659756f867-bvjfd" event={"ID":"af4aac87-185c-49c3-960d-6ec1e41c3d48","Type":"ContainerStarted","Data":"c471ca533c6b13f49886bfd9f855c2734a4157095fd0588671300908c54c9a40"}
Apr 16 04:34:06.682543 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.682467 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-659756f867-bvjfd" podUID="af4aac87-185c-49c3-960d-6ec1e41c3d48" containerName="authorino" containerID="cri-o://d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d" gracePeriod=30
Apr 16 04:34:06.694927 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.694875 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-95d87" podStartSLOduration=2.337002227 podStartE2EDuration="2.694862843s" podCreationTimestamp="2026-04-16 04:34:04 +0000 UTC" firstStartedPulling="2026-04-16 04:34:05.37826133 +0000 UTC m=+601.194801387" lastFinishedPulling="2026-04-16 04:34:05.736121933 +0000 UTC m=+601.552662003" observedRunningTime="2026-04-16 04:34:06.694024252 +0000 UTC m=+602.510564331" watchObservedRunningTime="2026-04-16 04:34:06.694862843 +0000 UTC m=+602.511402923"
Apr 16 04:34:06.707206 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.707164 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-659756f867-bvjfd" podStartSLOduration=1.303919284 podStartE2EDuration="1.707151296s" podCreationTimestamp="2026-04-16 04:34:05 +0000 UTC" firstStartedPulling="2026-04-16 04:34:05.710793099 +0000 UTC m=+601.527333160" lastFinishedPulling="2026-04-16 04:34:06.114025113 +0000 UTC m=+601.930565172" observedRunningTime="2026-04-16 04:34:06.706115555 +0000 UTC m=+602.522655638" watchObservedRunningTime="2026-04-16 04:34:06.707151296 +0000 UTC m=+602.523691431"
Apr 16 04:34:06.938733 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.938712 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-659756f867-bvjfd"
Apr 16 04:34:06.941773 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.941756 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-95d87"
Apr 16 04:34:06.997934 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.997909 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcwsr\" (UniqueName: \"kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr\") pod \"af4aac87-185c-49c3-960d-6ec1e41c3d48\" (UID: \"af4aac87-185c-49c3-960d-6ec1e41c3d48\") "
Apr 16 04:34:06.998093 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:06.997969 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6tzt\" (UniqueName: \"kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt\") pod \"6a23f264-97fc-450b-ae16-2a7310176757\" (UID: \"6a23f264-97fc-450b-ae16-2a7310176757\") "
Apr 16 04:34:07.000055 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.000023 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt" (OuterVolumeSpecName: "kube-api-access-w6tzt") pod "6a23f264-97fc-450b-ae16-2a7310176757" (UID: "6a23f264-97fc-450b-ae16-2a7310176757"). InnerVolumeSpecName "kube-api-access-w6tzt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:34:07.000156 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.000056 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr" (OuterVolumeSpecName: "kube-api-access-kcwsr") pod "af4aac87-185c-49c3-960d-6ec1e41c3d48" (UID: "af4aac87-185c-49c3-960d-6ec1e41c3d48"). InnerVolumeSpecName "kube-api-access-kcwsr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:34:07.099426 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.099398 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6tzt\" (UniqueName: \"kubernetes.io/projected/6a23f264-97fc-450b-ae16-2a7310176757-kube-api-access-w6tzt\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:34:07.099426 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.099423 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcwsr\" (UniqueName: \"kubernetes.io/projected/af4aac87-185c-49c3-960d-6ec1e41c3d48-kube-api-access-kcwsr\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:34:07.687458 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.687424 2578 generic.go:358] "Generic (PLEG): container finished" podID="6a23f264-97fc-450b-ae16-2a7310176757" containerID="2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef" exitCode=0
Apr 16 04:34:07.687899 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.687473 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-95d87"
Apr 16 04:34:07.687899 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.687497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-95d87" event={"ID":"6a23f264-97fc-450b-ae16-2a7310176757","Type":"ContainerDied","Data":"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"}
Apr 16 04:34:07.687899 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.687530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-95d87" event={"ID":"6a23f264-97fc-450b-ae16-2a7310176757","Type":"ContainerDied","Data":"104cf3dc99ab66f08f75f5e427632ad2f7bea46116fc6903d224e008617e7b42"}
Apr 16 04:34:07.687899 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.687550 2578 scope.go:117] "RemoveContainer" containerID="2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"
Apr 16 04:34:07.688847 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.688739 2578 generic.go:358] "Generic (PLEG): container finished" podID="af4aac87-185c-49c3-960d-6ec1e41c3d48" containerID="d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d" exitCode=0
Apr 16 04:34:07.688847 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.688786 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-659756f867-bvjfd"
Apr 16 04:34:07.688847 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.688812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-659756f867-bvjfd" event={"ID":"af4aac87-185c-49c3-960d-6ec1e41c3d48","Type":"ContainerDied","Data":"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"}
Apr 16 04:34:07.688847 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.688833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-659756f867-bvjfd" event={"ID":"af4aac87-185c-49c3-960d-6ec1e41c3d48","Type":"ContainerDied","Data":"c471ca533c6b13f49886bfd9f855c2734a4157095fd0588671300908c54c9a40"}
Apr 16 04:34:07.695659 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.695639 2578 scope.go:117] "RemoveContainer" containerID="2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"
Apr 16 04:34:07.695906 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:34:07.695887 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef\": container with ID starting with 2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef not found: ID does not exist" containerID="2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"
Apr 16 04:34:07.695982 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.695921 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef"} err="failed to get container status \"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef\": rpc error: code = NotFound desc = could not find container \"2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef\": container with ID starting with 2d4f8550b5bcb71a30149ff17a14df33a50f15a906d669b79550c6ac98b239ef not found: ID does not exist"
Apr 16 04:34:07.695982 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.695946 2578 scope.go:117] "RemoveContainer" containerID="d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"
Apr 16 04:34:07.703337 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.703285 2578 scope.go:117] "RemoveContainer" containerID="d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"
Apr 16 04:34:07.703609 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:34:07.703591 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d\": container with ID starting with d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d not found: ID does not exist" containerID="d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"
Apr 16 04:34:07.703660 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.703616 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d"} err="failed to get container status \"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d\": rpc error: code = NotFound desc = could not find container \"d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d\": container with ID starting with d6741bef49ca6c2761203128e49c80333d79108e216289f38a50c68bae87303d not found: ID does not exist"
Apr 16 04:34:07.710883 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.710861 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"]
Apr 16 04:34:07.714125 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.714106 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-95d87"]
Apr 16 04:34:07.723026 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.723007 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"]
Apr 16 04:34:07.726166 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:07.726148 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-659756f867-bvjfd"]
Apr 16 04:34:08.790333 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:08.790284 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a23f264-97fc-450b-ae16-2a7310176757" path="/var/lib/kubelet/pods/6a23f264-97fc-450b-ae16-2a7310176757/volumes"
Apr 16 04:34:08.790690 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:08.790622 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4aac87-185c-49c3-960d-6ec1e41c3d48" path="/var/lib/kubelet/pods/af4aac87-185c-49c3-960d-6ec1e41c3d48/volumes"
Apr 16 04:34:09.015127 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.015095 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"]
Apr 16 04:34:09.015336 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.015313 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-6bkht" podUID="3f0929c5-a77d-461c-8795-3237345fd6f0" containerName="authorino" containerID="cri-o://cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630" gracePeriod=30
Apr 16 04:34:09.248265 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.248242 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6bkht"
Apr 16 04:34:09.313534 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.313501 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl54w\" (UniqueName: \"kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w\") pod \"3f0929c5-a77d-461c-8795-3237345fd6f0\" (UID: \"3f0929c5-a77d-461c-8795-3237345fd6f0\") "
Apr 16 04:34:09.315536 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.315509 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w" (OuterVolumeSpecName: "kube-api-access-jl54w") pod "3f0929c5-a77d-461c-8795-3237345fd6f0" (UID: "3f0929c5-a77d-461c-8795-3237345fd6f0"). InnerVolumeSpecName "kube-api-access-jl54w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:34:09.414231 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.414161 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jl54w\" (UniqueName: \"kubernetes.io/projected/3f0929c5-a77d-461c-8795-3237345fd6f0-kube-api-access-jl54w\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:34:09.698758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.698671 2578 generic.go:358] "Generic (PLEG): container finished" podID="3f0929c5-a77d-461c-8795-3237345fd6f0" containerID="cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630" exitCode=0
Apr 16 04:34:09.698758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.698717 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6bkht"
Apr 16 04:34:09.698758 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.698736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6bkht" event={"ID":"3f0929c5-a77d-461c-8795-3237345fd6f0","Type":"ContainerDied","Data":"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"}
Apr 16 04:34:09.699019 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.698771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6bkht" event={"ID":"3f0929c5-a77d-461c-8795-3237345fd6f0","Type":"ContainerDied","Data":"528db26ef445e5842197196cc12397bc2acd2e1c8d546289cca380758c7305bd"}
Apr 16 04:34:09.699019 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.698791 2578 scope.go:117] "RemoveContainer" containerID="cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"
Apr 16 04:34:09.707193 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.707178 2578 scope.go:117] "RemoveContainer" containerID="cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"
Apr 16 04:34:09.707553 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:34:09.707528 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630\": container with ID starting with cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630 not found: ID does not exist" containerID="cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"
Apr 16 04:34:09.707652 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.707558 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630"} err="failed to get container status \"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630\": rpc error: code = NotFound desc = could not find container \"cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630\": container with ID starting with cd224d43ceebf982644f24a7f68c2417667579827a7b95d5ea36e29c662dd630 not found: ID does not exist"
Apr 16 04:34:09.721420 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.721399 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"]
Apr 16 04:34:09.726320 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:09.726287 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-6bkht"]
Apr 16 04:34:10.790924 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:34:10.790883 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0929c5-a77d-461c-8795-3237345fd6f0" path="/var/lib/kubelet/pods/3f0929c5-a77d-461c-8795-3237345fd6f0/volumes"
Apr 16 04:35:12.233089 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233054 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"]
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233316 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f0929c5-a77d-461c-8795-3237345fd6f0" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233330 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0929c5-a77d-461c-8795-3237345fd6f0" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233338 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a23f264-97fc-450b-ae16-2a7310176757" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233344 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a23f264-97fc-450b-ae16-2a7310176757" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233352 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af4aac87-185c-49c3-960d-6ec1e41c3d48" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233358 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4aac87-185c-49c3-960d-6ec1e41c3d48" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233409 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a23f264-97fc-450b-ae16-2a7310176757" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233416 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="af4aac87-185c-49c3-960d-6ec1e41c3d48" containerName="authorino"
Apr 16 04:35:12.233520 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.233425 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f0929c5-a77d-461c-8795-3237345fd6f0" containerName="authorino"
Apr 16 04:35:12.236472 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.236457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.240246 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.240220 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\""
Apr 16 04:35:12.240403 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.240309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 04:35:12.240403 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.240353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-w8nb4\""
Apr 16 04:35:12.241796 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.241772 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"]
Apr 16 04:35:12.386238 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.386207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.386439 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.386246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.386439 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.386270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lrh\" (UniqueName: \"kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.487290 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.487210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.487290 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.487253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lrh\" (UniqueName: \"kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.487477 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.487348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.487892 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.487868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.489751 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.489730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.494483 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.494459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lrh\" (UniqueName: \"kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh\") pod \"authorino-64946858d5-jqh7z\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") " pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.546274 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.546251 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:35:12.657754 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.657730 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"]
Apr 16 04:35:12.660190 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:35:12.660161 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb074e8aa_5270_4d9f_a612_d6e3c2f5aa12.slice/crio-c4ef6a41dbae11e0fa61bd8bee93bbfc600fa4dc7fb64829a4d1dfae35d48349 WatchSource:0}: Error finding container c4ef6a41dbae11e0fa61bd8bee93bbfc600fa4dc7fb64829a4d1dfae35d48349: Status 404 returned error can't find the container with id c4ef6a41dbae11e0fa61bd8bee93bbfc600fa4dc7fb64829a4d1dfae35d48349
Apr 16 04:35:12.661817 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.661797 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 04:35:12.893370 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:12.893273 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64946858d5-jqh7z" event={"ID":"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12","Type":"ContainerStarted","Data":"c4ef6a41dbae11e0fa61bd8bee93bbfc600fa4dc7fb64829a4d1dfae35d48349"}
Apr 16 04:35:13.897871 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:13.897838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64946858d5-jqh7z" event={"ID":"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12","Type":"ContainerStarted","Data":"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902"}
Apr 16 04:35:13.911751 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:13.911711 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-64946858d5-jqh7z" podStartSLOduration=1.472391208 podStartE2EDuration="1.911696879s" podCreationTimestamp="2026-04-16 04:35:12 +0000 UTC" firstStartedPulling="2026-04-16 04:35:12.661955369 +0000 UTC m=+668.478495427" lastFinishedPulling="2026-04-16 04:35:13.101261041 +0000 UTC m=+668.917801098" observedRunningTime="2026-04-16 04:35:13.910486477 +0000 UTC m=+669.727026558" watchObservedRunningTime="2026-04-16 04:35:13.911696879 +0000 UTC m=+669.728236959"
Apr 16 04:35:17.943315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:17.943269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"]
Apr 16 04:35:36.489360 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:36.489327 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"]
Apr 16 04:35:39.992603 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:39.992569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"]
Apr 16 04:35:51.584069 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:35:51.583693 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"]
Apr 16 04:36:14.681209 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:14.681175 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"]
Apr 16 04:36:36.468182 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.468145 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-687c85d8d4-pvm7p"]
Apr 16 04:36:36.471425 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.471404 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.477674 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.477649 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-687c85d8d4-pvm7p"]
Apr 16 04:36:36.612056 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.612015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e885594c-0409-4740-b6e1-6683bb838e2b-tls-cert\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.612246 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.612073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e885594c-0409-4740-b6e1-6683bb838e2b-oidc-ca\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.612246 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.612147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnxb\" (UniqueName: \"kubernetes.io/projected/e885594c-0409-4740-b6e1-6683bb838e2b-kube-api-access-9cnxb\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.712864 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.712830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnxb\" (UniqueName: \"kubernetes.io/projected/e885594c-0409-4740-b6e1-6683bb838e2b-kube-api-access-9cnxb\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.713047 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.712875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e885594c-0409-4740-b6e1-6683bb838e2b-tls-cert\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.713047 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.712913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e885594c-0409-4740-b6e1-6683bb838e2b-oidc-ca\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.713503 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.713484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e885594c-0409-4740-b6e1-6683bb838e2b-oidc-ca\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.715356 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.715334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e885594c-0409-4740-b6e1-6683bb838e2b-tls-cert\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.719678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.719623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnxb\" (UniqueName: \"kubernetes.io/projected/e885594c-0409-4740-b6e1-6683bb838e2b-kube-api-access-9cnxb\") pod \"authorino-687c85d8d4-pvm7p\" (UID: \"e885594c-0409-4740-b6e1-6683bb838e2b\") " pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.781107 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.781078 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-687c85d8d4-pvm7p"
Apr 16 04:36:36.897260 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:36.897234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-687c85d8d4-pvm7p"]
Apr 16 04:36:36.899681 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:36:36.899654 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode885594c_0409_4740_b6e1_6683bb838e2b.slice/crio-092fef17b357da3aeb58f900030010f770b9ec3c9420c9349f47253c795b1915 WatchSource:0}: Error finding container 092fef17b357da3aeb58f900030010f770b9ec3c9420c9349f47253c795b1915: Status 404 returned error can't find the container with id 092fef17b357da3aeb58f900030010f770b9ec3c9420c9349f47253c795b1915
Apr 16 04:36:37.174968 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:37.174932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-687c85d8d4-pvm7p" event={"ID":"e885594c-0409-4740-b6e1-6683bb838e2b","Type":"ContainerStarted","Data":"092fef17b357da3aeb58f900030010f770b9ec3c9420c9349f47253c795b1915"}
Apr 16 04:36:38.179198 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.179160 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-687c85d8d4-pvm7p" event={"ID":"e885594c-0409-4740-b6e1-6683bb838e2b","Type":"ContainerStarted","Data":"1279db20923e7d3a927677bee0fbfbd212378767e82308903dfef3327fb530bd"}
Apr 16 04:36:38.194714 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.194673 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-687c85d8d4-pvm7p" podStartSLOduration=1.802050538 podStartE2EDuration="2.194659556s" podCreationTimestamp="2026-04-16 04:36:36 +0000 UTC" firstStartedPulling="2026-04-16 04:36:36.900858803 +0000 UTC m=+752.717398860" lastFinishedPulling="2026-04-16 04:36:37.293467815 +0000 UTC m=+753.110007878" observedRunningTime="2026-04-16 04:36:38.192598754 +0000 UTC m=+754.009138833" watchObservedRunningTime="2026-04-16 04:36:38.194659556 +0000 UTC m=+754.011199683"
Apr 16 04:36:38.214776 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.214745 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"]
Apr 16 04:36:38.214974 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.214953 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-64946858d5-jqh7z" podUID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" containerName="authorino" containerID="cri-o://99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902" gracePeriod=30
Apr 16 04:36:38.450742 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.450719 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64946858d5-jqh7z"
Apr 16 04:36:38.629365 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.629315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca\") pod \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") "
Apr 16 04:36:38.629534 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.629388 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7lrh\" (UniqueName: \"kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh\") pod \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") "
Apr 16 04:36:38.629534 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.629429 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert\") pod \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\" (UID: \"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12\") "
Apr 16 04:36:38.631398 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.631366 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh" (OuterVolumeSpecName: "kube-api-access-v7lrh") pod "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" (UID: "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12"). InnerVolumeSpecName "kube-api-access-v7lrh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:36:38.633933 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.633911 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" (UID: "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:36:38.639398 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.639372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" (UID: "b074e8aa-5270-4d9f-a612-d6e3c2f5aa12"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:36:38.729848 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.729825 2578 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-oidc-ca\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:36:38.729938 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.729850 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7lrh\" (UniqueName: \"kubernetes.io/projected/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-kube-api-access-v7lrh\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:36:38.729938 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:38.729859 2578 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12-tls-cert\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\""
Apr 16 04:36:39.183153 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.183116 2578 generic.go:358] "Generic (PLEG): container finished"
podID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" containerID="99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902" exitCode=0 Apr 16 04:36:39.183614 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.183183 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64946858d5-jqh7z" Apr 16 04:36:39.183614 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.183205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64946858d5-jqh7z" event={"ID":"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12","Type":"ContainerDied","Data":"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902"} Apr 16 04:36:39.183614 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.183246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64946858d5-jqh7z" event={"ID":"b074e8aa-5270-4d9f-a612-d6e3c2f5aa12","Type":"ContainerDied","Data":"c4ef6a41dbae11e0fa61bd8bee93bbfc600fa4dc7fb64829a4d1dfae35d48349"} Apr 16 04:36:39.183614 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.183261 2578 scope.go:117] "RemoveContainer" containerID="99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902" Apr 16 04:36:39.190953 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.190935 2578 scope.go:117] "RemoveContainer" containerID="99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902" Apr 16 04:36:39.191172 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:36:39.191154 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902\": container with ID starting with 99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902 not found: ID does not exist" containerID="99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902" Apr 16 04:36:39.191221 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.191181 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902"} err="failed to get container status \"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902\": rpc error: code = NotFound desc = could not find container \"99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902\": container with ID starting with 99a1dd21e169d61f8af1efdfc6fff57e953189bf35be04c0b8cf4193838f8902 not found: ID does not exist" Apr 16 04:36:39.197875 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.197855 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"] Apr 16 04:36:39.201428 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:39.201410 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-64946858d5-jqh7z"] Apr 16 04:36:40.790841 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:40.790811 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" path="/var/lib/kubelet/pods/b074e8aa-5270-4d9f-a612-d6e3c2f5aa12/volumes" Apr 16 04:36:59.781810 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:36:59.781776 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:37:07.884775 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:37:07.884735 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:37:38.586281 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:37:38.586197 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:37:54.800164 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:37:54.800132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:38:33.488233 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:38:33.488197 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:38:50.480680 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:38:50.480642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:39:04.493470 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:39:04.493428 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:39:04.733535 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:39:04.733506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:39:04.734094 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:39:04.734074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:39:20.480774 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:39:20.480739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:40:13.791149 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:40:13.791059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:40:23.184910 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:40:23.184878 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:40:39.780332 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:40:39.780282 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:40:48.189914 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:40:48.189877 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:41:05.184398 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:41:05.184360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:41:13.587631 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:41:13.587592 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:41:46.290836 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:41:46.290754 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:41:54.481499 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:41:54.481465 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:42:02.384689 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:42:02.384651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:42:10.979799 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:42:10.979767 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:42:19.193811 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:42:19.193774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:42:36.583252 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:42:36.583204 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:42:46.996053 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:42:46.996017 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:43:34.600678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:43:34.600587 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:43:42.665796 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:43:42.662530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:43:51.696666 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:43:51.696632 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:43:59.703061 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:43:59.703023 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:04.759168 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:04.759144 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:44:04.761186 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:04.761164 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:44:09.748915 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:09.748881 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:17.787025 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:17.786987 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:27.005444 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:27.005414 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:34.982867 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:34.982840 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:44.990778 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:44:44.990701 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:44:53.487663 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:44:53.487616 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:00.157050 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.157013 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:45:00.157446 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.157257 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" containerName="authorino" Apr 16 04:45:00.157446 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.157268 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" containerName="authorino" Apr 16 04:45:00.157446 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.157353 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b074e8aa-5270-4d9f-a612-d6e3c2f5aa12" containerName="authorino" Apr 16 04:45:00.160017 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.160000 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:45:00.163086 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.163066 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-h2mqr\"" Apr 16 04:45:00.187462 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.187432 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:45:00.235663 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.235638 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtv2c\" (UniqueName: \"kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c\") pod \"maas-api-key-cleanup-29605245-gtv5w\" (UID: \"2baf5da4-0252-4a03-9446-21f83fb88151\") " pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:45:00.336937 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.336906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtv2c\" (UniqueName: \"kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c\") pod \"maas-api-key-cleanup-29605245-gtv5w\" (UID: \"2baf5da4-0252-4a03-9446-21f83fb88151\") " pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:45:00.350809 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.350779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtv2c\" (UniqueName: \"kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c\") pod \"maas-api-key-cleanup-29605245-gtv5w\" (UID: \"2baf5da4-0252-4a03-9446-21f83fb88151\") " pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:45:00.469594 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.469509 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:45:00.587321 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.587278 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:45:00.589595 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:45:00.589566 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2baf5da4_0252_4a03_9446_21f83fb88151.slice/crio-0bd85557138087f4b3e7e73eb6708d6abfb410ae9f7cc6c10f76a9bdaf3bbe4a WatchSource:0}: Error finding container 0bd85557138087f4b3e7e73eb6708d6abfb410ae9f7cc6c10f76a9bdaf3bbe4a: Status 404 returned error can't find the container with id 0bd85557138087f4b3e7e73eb6708d6abfb410ae9f7cc6c10f76a9bdaf3bbe4a Apr 16 04:45:00.591312 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.591273 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:45:00.762498 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:00.762462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerStarted","Data":"0bd85557138087f4b3e7e73eb6708d6abfb410ae9f7cc6c10f76a9bdaf3bbe4a"} Apr 16 04:45:02.613460 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:02.613423 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:03.775463 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:03.775426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerStarted","Data":"bfcdc8973634697735991361a71726cbd754b535329b322cc2374bf49be53f7c"} Apr 16 04:45:03.797602 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:03.797556 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" podStartSLOduration=1.612465812 podStartE2EDuration="3.797541416s" podCreationTimestamp="2026-04-16 04:45:00 +0000 UTC" firstStartedPulling="2026-04-16 04:45:00.591416934 +0000 UTC m=+1256.407956992" lastFinishedPulling="2026-04-16 04:45:02.776492524 +0000 UTC m=+1258.593032596" observedRunningTime="2026-04-16 04:45:03.797035718 +0000 UTC m=+1259.613575798" watchObservedRunningTime="2026-04-16 04:45:03.797541416 +0000 UTC m=+1259.614081496" Apr 16 04:45:10.995075 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:10.995033 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:19.624742 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:19.624701 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:23.838307 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:23.838269 2578 generic.go:358] "Generic (PLEG): container finished" podID="2baf5da4-0252-4a03-9446-21f83fb88151" containerID="bfcdc8973634697735991361a71726cbd754b535329b322cc2374bf49be53f7c" exitCode=6 Apr 16 04:45:23.838678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:23.838341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerDied","Data":"bfcdc8973634697735991361a71726cbd754b535329b322cc2374bf49be53f7c"} Apr 16 04:45:23.838678 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:23.838628 2578 scope.go:117] "RemoveContainer" containerID="bfcdc8973634697735991361a71726cbd754b535329b322cc2374bf49be53f7c" Apr 16 04:45:24.843022 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:24.842990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" 
event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerStarted","Data":"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686"} Apr 16 04:45:27.903603 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:27.903563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:36.714132 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:36.714099 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:44.909616 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:44.909585 2578 generic.go:358] "Generic (PLEG): container finished" podID="2baf5da4-0252-4a03-9446-21f83fb88151" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" exitCode=6 Apr 16 04:45:44.910017 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:44.909633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerDied","Data":"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686"} Apr 16 04:45:44.910017 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:44.909663 2578 scope.go:117] "RemoveContainer" containerID="bfcdc8973634697735991361a71726cbd754b535329b322cc2374bf49be53f7c" Apr 16 04:45:44.910017 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:44.909947 2578 scope.go:117] "RemoveContainer" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" Apr 16 04:45:44.910205 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:45:44.910143 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29605245-gtv5w_opendatahub(2baf5da4-0252-4a03-9446-21f83fb88151)\"" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" 
podUID="2baf5da4-0252-4a03-9446-21f83fb88151" Apr 16 04:45:45.433537 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:45.433497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:55.098404 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:55.098365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:45:57.787471 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:57.787438 2578 scope.go:117] "RemoveContainer" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" Apr 16 04:45:58.958432 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:58.958398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerStarted","Data":"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6"} Apr 16 04:45:59.995120 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:59.995089 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:45:59.995521 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:45:59.995291 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" containerID="cri-o://d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6" gracePeriod=30 Apr 16 04:46:02.521208 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:02.521176 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:46:18.528653 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:18.528627 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:46:18.647842 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:18.647761 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtv2c\" (UniqueName: \"kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c\") pod \"2baf5da4-0252-4a03-9446-21f83fb88151\" (UID: \"2baf5da4-0252-4a03-9446-21f83fb88151\") " Apr 16 04:46:18.649906 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:18.649876 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c" (OuterVolumeSpecName: "kube-api-access-rtv2c") pod "2baf5da4-0252-4a03-9446-21f83fb88151" (UID: "2baf5da4-0252-4a03-9446-21f83fb88151"). InnerVolumeSpecName "kube-api-access-rtv2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:46:18.748440 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:18.748397 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtv2c\" (UniqueName: \"kubernetes.io/projected/2baf5da4-0252-4a03-9446-21f83fb88151-kube-api-access-rtv2c\") on node \"ip-10-0-140-211.ec2.internal\" DevicePath \"\"" Apr 16 04:46:19.018447 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.018413 2578 generic.go:358] "Generic (PLEG): container finished" podID="2baf5da4-0252-4a03-9446-21f83fb88151" containerID="d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6" exitCode=6 Apr 16 04:46:19.018639 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.018477 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" Apr 16 04:46:19.018639 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.018493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerDied","Data":"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6"} Apr 16 04:46:19.018639 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.018540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605245-gtv5w" event={"ID":"2baf5da4-0252-4a03-9446-21f83fb88151","Type":"ContainerDied","Data":"0bd85557138087f4b3e7e73eb6708d6abfb410ae9f7cc6c10f76a9bdaf3bbe4a"} Apr 16 04:46:19.018639 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.018560 2578 scope.go:117] "RemoveContainer" containerID="d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6" Apr 16 04:46:19.026240 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.026224 2578 scope.go:117] "RemoveContainer" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" Apr 16 04:46:19.032804 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.032790 2578 scope.go:117] "RemoveContainer" containerID="d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6" Apr 16 04:46:19.033071 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:46:19.033054 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6\": container with ID starting with d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6 not found: ID does not exist" containerID="d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6" Apr 16 04:46:19.033105 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.033080 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6"} err="failed to get container status \"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6\": rpc error: code = NotFound desc = could not find container \"d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6\": container with ID starting with d81c10a050ed2c7165e30fb0fcfcd230d906c2e02b9f50531ffdeb5ed37c89e6 not found: ID does not exist" Apr 16 04:46:19.033105 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.033098 2578 scope.go:117] "RemoveContainer" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" Apr 16 04:46:19.033360 ip-10-0-140-211 kubenswrapper[2578]: E0416 04:46:19.033337 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686\": container with ID starting with ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686 not found: ID does not exist" containerID="ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686" Apr 16 04:46:19.033401 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.033367 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686"} err="failed to get container status \"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686\": rpc error: code = NotFound desc = could not find container \"ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686\": container with ID starting with ca23960d91ae4eec4ae8aeb4a64c308c001da11819038c7d7681c74c76c0a686 not found: ID does not exist" Apr 16 04:46:19.040426 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:19.040406 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:46:19.043726 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:46:19.043708 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605245-gtv5w"] Apr 16 04:46:20.790734 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:46:20.790702 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" path="/var/lib/kubelet/pods/2baf5da4-0252-4a03-9446-21f83fb88151/volumes" Apr 16 04:47:13.093325 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:47:13.093268 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:47:17.090078 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:47:17.090047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:47:27.487841 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:47:27.487799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:47:58.490091 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:47:58.490050 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:48:41.487594 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:48:41.487563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:48:49.403960 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:48:49.403919 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:48:57.685132 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:48:57.685100 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:04.783853 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:04.783825 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:49:04.788043 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:04.788018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:49:06.497148 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:06.497116 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:16.292258 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:16.292174 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:23.995247 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:23.995213 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:31.896792 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:31.896757 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:39.787449 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:39.787415 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:48.690840 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:48.690800 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:49:57.192214 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:49:57.192179 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:05.884871 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:50:05.884840 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:13.491509 ip-10-0-140-211 
kubenswrapper[2578]: I0416 04:50:13.491477 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:31.781251 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:50:31.781214 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:40.194706 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:50:40.194667 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:48.991351 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:50:48.991268 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:50:56.794004 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:50:56.793972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:13.894544 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:51:13.894507 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:22.194900 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:51:22.194866 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:31.189102 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:51:31.189068 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:38.989033 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:51:38.989002 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:48.398109 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:51:48.398074 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:51:56.899218 ip-10-0-140-211 kubenswrapper[2578]: I0416 
04:51:56.899185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:52:06.185309 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:52:06.185257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:52:22.200315 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:52:22.200216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:52:32.111781 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:52:32.111740 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:52:48.295594 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:52:48.295562 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:52:58.289987 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:52:58.289952 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:05.086327 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:05.086282 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:13.488497 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:13.488458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:21.802781 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:21.802747 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:38.400236 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:38.400202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:47.385666 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:47.385590 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:53:55.291484 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:53:55.291446 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:54:03.090634 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:03.090596 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:54:04.804793 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:04.804757 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:54:04.810037 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:04.810011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log" Apr 16 04:54:27.891922 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:27.891882 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:54:40.890132 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:40.890094 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-79dg5"] Apr 16 04:54:44.175751 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:44.175716 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-687c85d8d4-pvm7p_e885594c-0409-4740-b6e1-6683bb838e2b/authorino/0.log" Apr 16 04:54:48.492002 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:48.491969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-2szfn_5b8d485b-b841-465a-b467-cd0eeb14836d/manager/0.log" Apr 16 04:54:50.081505 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:50.081472 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-687c85d8d4-pvm7p_e885594c-0409-4740-b6e1-6683bb838e2b/authorino/0.log" Apr 16 04:54:50.414732 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:50.414654 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-wqvvc_091e67ca-cd62-42b9-93f6-30f05c12506b/kuadrant-console-plugin/0.log" Apr 16 04:54:50.755348 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:50.755319 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-79dg5_5581d3a4-fa67-441a-8cb0-31637d096760/limitador/0.log" Apr 16 04:54:51.540038 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:51.540009 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-68d6cc647-pv6kd_a36c88de-8c4f-430f-8db6-fb29785a0264/kube-auth-proxy/0.log" Apr 16 04:54:56.416141 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416107 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4r9jh/must-gather-x6bgw"] Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416393 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416406 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416416 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416422 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 
04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416431 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416436 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416483 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416491 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.416528 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.416497 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf5da4-0252-4a03-9446-21f83fb88151" containerName="cleanup" Apr 16 04:54:56.419231 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.419201 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.422286 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.422259 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"openshift-service-ca.crt\"" Apr 16 04:54:56.422406 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.422288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"kube-root-ca.crt\"" Apr 16 04:54:56.423491 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.423478 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4r9jh\"/\"default-dockercfg-4f7rr\"" Apr 16 04:54:56.436098 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.436076 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/must-gather-x6bgw"] Apr 16 04:54:56.462913 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.462889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hzcm\" (UniqueName: \"kubernetes.io/projected/2992f9a0-743e-49f0-83e2-f512ad523c16-kube-api-access-7hzcm\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.463036 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.462939 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2992f9a0-743e-49f0-83e2-f512ad523c16-must-gather-output\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.563683 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.563656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hzcm\" (UniqueName: 
\"kubernetes.io/projected/2992f9a0-743e-49f0-83e2-f512ad523c16-kube-api-access-7hzcm\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.563823 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.563704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2992f9a0-743e-49f0-83e2-f512ad523c16-must-gather-output\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.563986 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.563971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2992f9a0-743e-49f0-83e2-f512ad523c16-must-gather-output\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.571740 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.571714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hzcm\" (UniqueName: \"kubernetes.io/projected/2992f9a0-743e-49f0-83e2-f512ad523c16-kube-api-access-7hzcm\") pod \"must-gather-x6bgw\" (UID: \"2992f9a0-743e-49f0-83e2-f512ad523c16\") " pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.727979 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.727957 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" Apr 16 04:54:56.847318 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.847257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/must-gather-x6bgw"] Apr 16 04:54:56.849511 ip-10-0-140-211 kubenswrapper[2578]: W0416 04:54:56.849482 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2992f9a0_743e_49f0_83e2_f512ad523c16.slice/crio-06749317cfbddba01d19427d97b1218e24cb70d8da1c5b1d99c4c3662b9163f1 WatchSource:0}: Error finding container 06749317cfbddba01d19427d97b1218e24cb70d8da1c5b1d99c4c3662b9163f1: Status 404 returned error can't find the container with id 06749317cfbddba01d19427d97b1218e24cb70d8da1c5b1d99c4c3662b9163f1 Apr 16 04:54:56.851142 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:56.851127 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:54:57.711995 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:57.711955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" event={"ID":"2992f9a0-743e-49f0-83e2-f512ad523c16","Type":"ContainerStarted","Data":"06749317cfbddba01d19427d97b1218e24cb70d8da1c5b1d99c4c3662b9163f1"} Apr 16 04:54:58.718183 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:58.718139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" event={"ID":"2992f9a0-743e-49f0-83e2-f512ad523c16","Type":"ContainerStarted","Data":"af725605ee9e38a6263532934fbb3b14bffdd6e8f88449638f2b6e502a30f642"} Apr 16 04:54:58.718785 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:58.718192 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" 
event={"ID":"2992f9a0-743e-49f0-83e2-f512ad523c16","Type":"ContainerStarted","Data":"87dbe9dd0743cff596a4449d47bda6a9817bb4c616627a469b6f5dc3bc7748bf"} Apr 16 04:54:58.735127 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:58.735075 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4r9jh/must-gather-x6bgw" podStartSLOduration=1.927703288 podStartE2EDuration="2.735060642s" podCreationTimestamp="2026-04-16 04:54:56 +0000 UTC" firstStartedPulling="2026-04-16 04:54:56.85126907 +0000 UTC m=+1852.667809128" lastFinishedPulling="2026-04-16 04:54:57.658626419 +0000 UTC m=+1853.475166482" observedRunningTime="2026-04-16 04:54:58.733243332 +0000 UTC m=+1854.549783414" watchObservedRunningTime="2026-04-16 04:54:58.735060642 +0000 UTC m=+1854.551600722" Apr 16 04:54:59.242158 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:59.242132 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-95bvb_32c323b6-9b7e-46de-ac37-f304c3267420/global-pull-secret-syncer/0.log" Apr 16 04:54:59.331514 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:59.331485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-97kqm_dd90ea8e-03cf-460f-8525-c28183fc3a33/konnectivity-agent/0.log" Apr 16 04:54:59.451154 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:54:59.451125 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-211.ec2.internal_1a60e730c1a02bac81e499c95b0d4aa1/haproxy/0.log" Apr 16 04:55:04.076504 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:04.076071 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-687c85d8d4-pvm7p_e885594c-0409-4740-b6e1-6683bb838e2b/authorino/0.log" Apr 16 04:55:04.173854 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:04.173826 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-wqvvc_091e67ca-cd62-42b9-93f6-30f05c12506b/kuadrant-console-plugin/0.log" Apr 16 04:55:04.346807 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:04.346696 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-79dg5_5581d3a4-fa67-441a-8cb0-31637d096760/limitador/0.log" Apr 16 04:55:06.183701 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:06.183675 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7f76d64bc4-h4kxs_f703fc51-0bf6-45d6-8fff-2bce418bb4c4/metrics-server/0.log" Apr 16 04:55:06.238236 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:06.238179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-56vp9_302d14c0-18e8-4940-948c-3b0fdf2bba1b/node-exporter/0.log" Apr 16 04:55:06.262967 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:06.262930 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-56vp9_302d14c0-18e8-4940-948c-3b0fdf2bba1b/kube-rbac-proxy/0.log" Apr 16 04:55:06.291379 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:06.291351 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-56vp9_302d14c0-18e8-4940-948c-3b0fdf2bba1b/init-textfile/0.log" Apr 16 04:55:07.694978 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.694943 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs"] Apr 16 04:55:07.701397 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.701370 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.707841 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.707812 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs"] Apr 16 04:55:07.762029 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.761991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-sys\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.762284 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.762264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-proc\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.762486 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.762468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-podres\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.762639 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.762619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwnd\" (UniqueName: \"kubernetes.io/projected/75e60604-e90b-4d0b-8713-9bfeb09876e6-kube-api-access-6nwnd\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " 
pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.762808 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.762787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-lib-modules\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863441 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwnd\" (UniqueName: \"kubernetes.io/projected/75e60604-e90b-4d0b-8713-9bfeb09876e6-kube-api-access-6nwnd\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863631 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-lib-modules\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863631 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-lib-modules\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863747 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-sys\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863747 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-proc\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863747 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-podres\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863866 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-proc\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863866 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-sys\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.863866 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.863809 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75e60604-e90b-4d0b-8713-9bfeb09876e6-podres\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:07.871816 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:07.871786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwnd\" (UniqueName: \"kubernetes.io/projected/75e60604-e90b-4d0b-8713-9bfeb09876e6-kube-api-access-6nwnd\") pod \"perf-node-gather-daemonset-8mtvs\" (UID: \"75e60604-e90b-4d0b-8713-9bfeb09876e6\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:08.018105 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.018069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" Apr 16 04:55:08.164021 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.163996 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs"] Apr 16 04:55:08.765917 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.765883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" event={"ID":"75e60604-e90b-4d0b-8713-9bfeb09876e6","Type":"ContainerStarted","Data":"d421ded84b1efe9dce3c93a2f0153f36be19865f2c2b29dd520266af1992d7ad"} Apr 16 04:55:08.765917 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.765922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" event={"ID":"75e60604-e90b-4d0b-8713-9bfeb09876e6","Type":"ContainerStarted","Data":"88976bbbe992072c0deadfb7f66b2bcb58805246056e48ea783b8a90cfd4dd15"} Apr 16 04:55:08.766364 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.765953 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs"
Apr 16 04:55:08.798093 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:08.798047 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs" podStartSLOduration=1.798031537 podStartE2EDuration="1.798031537s" podCreationTimestamp="2026-04-16 04:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:55:08.79561966 +0000 UTC m=+1864.612159744" watchObservedRunningTime="2026-04-16 04:55:08.798031537 +0000 UTC m=+1864.614571616"
Apr 16 04:55:10.213566 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:10.213535 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zn869_4248c02e-a96d-4c4c-a829-8fa7ddd59809/dns/0.log"
Apr 16 04:55:10.233975 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:10.233943 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zn869_4248c02e-a96d-4c4c-a829-8fa7ddd59809/kube-rbac-proxy/0.log"
Apr 16 04:55:10.276850 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:10.276819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s72p4_6630844f-3950-46f3-b23d-355ceec908ec/dns-node-resolver/0.log"
Apr 16 04:55:10.776365 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:10.776337 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-746fd49c8c-95h62_fb841652-cbca-45d5-970b-df6c93b00e4f/registry/0.log"
Apr 16 04:55:10.795510 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:10.795482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ctskn_e9738669-31c5-4a1c-8e9b-4f0691464165/node-ca/0.log"
Apr 16 04:55:11.768578 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:11.768549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-68d6cc647-pv6kd_a36c88de-8c4f-430f-8db6-fb29785a0264/kube-auth-proxy/0.log"
Apr 16 04:55:12.391964 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:12.391940 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bntvm_9cf31f3f-502a-403b-a015-b0d26d2ac92f/serve-healthcheck-canary/0.log"
Apr 16 04:55:12.893489 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:12.893463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9728b_fb2875df-44e0-4977-8768-9186ea96a129/kube-rbac-proxy/0.log"
Apr 16 04:55:12.912697 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:12.912673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9728b_fb2875df-44e0-4977-8768-9186ea96a129/exporter/0.log"
Apr 16 04:55:12.932203 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:12.932178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9728b_fb2875df-44e0-4977-8768-9186ea96a129/extractor/0.log"
Apr 16 04:55:14.781805 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:14.781769 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-8mtvs"
Apr 16 04:55:15.041910 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:15.041845 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-2szfn_5b8d485b-b841-465a-b467-cd0eeb14836d/manager/0.log"
Apr 16 04:55:16.400205 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:16.400181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5988777b7d-ktcrc_5cc40a23-e80f-48be-9feb-bfc24b833509/manager/0.log"
Apr 16 04:55:22.228354 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.228317 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/kube-multus-additional-cni-plugins/0.log"
Apr 16 04:55:22.254906 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.254882 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/egress-router-binary-copy/0.log"
Apr 16 04:55:22.275167 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.275145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/cni-plugins/0.log"
Apr 16 04:55:22.295337 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.295311 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/bond-cni-plugin/0.log"
Apr 16 04:55:22.315008 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.314982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/routeoverride-cni/0.log"
Apr 16 04:55:22.335692 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.335659 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/whereabouts-cni-bincopy/0.log"
Apr 16 04:55:22.357456 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.357431 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-btwn7_0763391d-17aa-4fb0-a753-c705589537ab/whereabouts-cni/0.log"
Apr 16 04:55:22.624248 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.624177 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rk4kv_6effaf82-47fd-4e1e-904a-407184af8a9d/kube-multus/0.log"
Apr 16 04:55:22.740314 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.740274 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qhcj5_b30a87b4-65a2-4504-be52-b10fb247dedb/network-metrics-daemon/0.log"
Apr 16 04:55:22.759658 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:22.759632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qhcj5_b30a87b4-65a2-4504-be52-b10fb247dedb/kube-rbac-proxy/0.log"
Apr 16 04:55:23.807668 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.807635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-controller/0.log"
Apr 16 04:55:23.825583 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.825558 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/0.log"
Apr 16 04:55:23.838383 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.838356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovn-acl-logging/1.log"
Apr 16 04:55:23.855672 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.855640 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/kube-rbac-proxy-node/0.log"
Apr 16 04:55:23.875814 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.875793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 04:55:23.898266 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.898244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/northd/0.log"
Apr 16 04:55:23.917317 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.917281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/nbdb/0.log"
Apr 16 04:55:23.939360 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:23.939327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/sbdb/0.log"
Apr 16 04:55:24.051003 ip-10-0-140-211 kubenswrapper[2578]: I0416 04:55:24.050961 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vmzk_b57f188f-8b5d-4bf1-9ab4-39e808fa255e/ovnkube-controller/0.log"